Dec 05 00:22:59 crc systemd[1]: Starting Kubernetes Kubelet... Dec 05 00:22:59 crc restorecon[4758]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 05 00:22:59 
crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 05 00:22:59 crc restorecon[4758]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 
00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:22:59 crc restorecon[4758]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc 
restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 05 00:22:59 crc restorecon[4758]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:22:59 
crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 00:22:59 
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:22:59 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 
00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 00:23:00 crc 
restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 
00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc 
restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 00:23:00 crc restorecon[4758]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 00:23:00 crc restorecon[4758]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Dec 05 00:23:00 crc kubenswrapper[4759]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 00:23:00 crc kubenswrapper[4759]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 05 00:23:00 crc kubenswrapper[4759]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 00:23:00 crc kubenswrapper[4759]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 00:23:00 crc kubenswrapper[4759]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 05 00:23:00 crc kubenswrapper[4759]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.901024 4759 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903408 4759 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903419 4759 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903424 4759 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903429 4759 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903433 4759 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903436 4759 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903440 4759 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903444 4759 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903450 4759 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
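The deprecation warnings above all point at the file passed to the kubelet's --config flag. As a minimal sketch only (field names are the upstream KubeletConfiguration ones; the socket path, taint, and reservation values are illustrative assumptions, not read from this node), the equivalent config-file stanza could look like:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (assumed CRI-O socket, for illustration)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --register-with-taints (example taint only)
registerWithTaints:
- key: node-role.kubernetes.io/master
  effect: NoSchedule
# replaces --system-reserved (illustrative sizes)
systemReserved:
  cpu: 500m
  memory: 1Gi
# the --minimum-container-ttl-duration warning points to eviction settings instead, e.g.:
evictionHard:
  memory.available: 100Mi

--pod-infra-container-image is the exception: per the server.go:211 message it should also be set in the remote runtime (for CRI-O, typically the pause_image option in crio.conf) rather than only on the kubelet side.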
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903454 4759 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903459 4759 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903462 4759 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903466 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903469 4759 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903473 4759 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903476 4759 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903480 4759 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903483 4759 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903493 4759 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903497 4759 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903502 4759 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903506 4759 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903510 4759 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903514 4759 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903518 4759 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903522 4759 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903526 4759 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903529 4759 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903533 4759 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903537 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903541 4759 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903545 4759 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903550 4759 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903555 4759 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903559 4759 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903563 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903566 4759 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903570 4759 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903573 4759 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903576 4759 feature_gate.go:330] unrecognized feature gate: Example Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903580 4759 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903584 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903587 4759 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903591 4759 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903594 4759 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903597 4759 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903601 4759 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903604 4759 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903608 4759 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903611 4759 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903615 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903618 4759 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903622 4759 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903625 4759 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903629 4759 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903633 4759 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903636 4759 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903640 4759 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 00:23:00 crc 
kubenswrapper[4759]: W1205 00:23:00.903643 4759 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903648 4759 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903653 4759 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903657 4759 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903661 4759 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903666 4759 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903671 4759 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903674 4759 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903678 4759 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903682 4759 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903686 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903690 4759 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.903693 4759 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903907 4759 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903917 4759 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903926 4759 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903932 4759 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903939 4759 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903943 4759 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903948 4759 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903953 4759 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903958 4759 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903962 4759 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903966 4759 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903970 4759 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903975 4759 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903979 4759 flags.go:64] FLAG: --cgroup-root="" Dec 05 
00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903982 4759 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903986 4759 flags.go:64] FLAG: --client-ca-file="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903990 4759 flags.go:64] FLAG: --cloud-config="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903994 4759 flags.go:64] FLAG: --cloud-provider="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.903998 4759 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904008 4759 flags.go:64] FLAG: --cluster-domain="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904012 4759 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904016 4759 flags.go:64] FLAG: --config-dir="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904020 4759 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904025 4759 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904030 4759 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904034 4759 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904038 4759 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904043 4759 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904047 4759 flags.go:64] FLAG: --contention-profiling="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904051 4759 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904055 4759 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904059 4759 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904063 4759 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904068 4759 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904072 4759 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904076 4759 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904080 4759 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904084 4759 flags.go:64] FLAG: --enable-server="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904088 4759 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904093 4759 flags.go:64] FLAG: --event-burst="100" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904098 4759 flags.go:64] FLAG: --event-qps="50" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904102 4759 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904106 4759 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904110 4759 flags.go:64] FLAG: --eviction-hard="" Dec 05 00:23:00 crc kubenswrapper[4759]: 
I1205 00:23:00.904115 4759 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904119 4759 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904123 4759 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904127 4759 flags.go:64] FLAG: --eviction-soft="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904131 4759 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904135 4759 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904141 4759 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904145 4759 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904149 4759 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904153 4759 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904157 4759 flags.go:64] FLAG: --feature-gates="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904163 4759 flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904167 4759 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904172 4759 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904176 4759 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904180 4759 flags.go:64] FLAG: --healthz-port="10248" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904185 4759 flags.go:64] FLAG: --help="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904189 4759 flags.go:64] FLAG: --hostname-override="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904193 4759 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904197 4759 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904201 4759 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904205 4759 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904209 4759 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904213 4759 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904217 4759 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904222 4759 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904225 4759 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904229 4759 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904234 4759 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904238 4759 flags.go:64] FLAG: --kube-reserved="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 
00:23:00.904242 4759 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904246 4759 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904250 4759 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904254 4759 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904258 4759 flags.go:64] FLAG: --lock-file="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904262 4759 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904266 4759 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904270 4759 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904277 4759 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904281 4759 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904285 4759 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904289 4759 flags.go:64] FLAG: --logging-format="text" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904293 4759 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904297 4759 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904314 4759 flags.go:64] FLAG: --manifest-url="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904319 4759 flags.go:64] FLAG: --manifest-url-header="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904325 4759 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904330 4759 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904335 4759 flags.go:64] FLAG: --max-pods="110" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904341 4759 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904347 4759 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904352 4759 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904358 4759 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904364 4759 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904369 4759 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904373 4759 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904383 4759 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904387 4759 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904392 4759 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904396 4759 flags.go:64] FLAG: 
--pod-cidr="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904400 4759 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904406 4759 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904411 4759 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904416 4759 flags.go:64] FLAG: --pods-per-core="0" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904420 4759 flags.go:64] FLAG: --port="10250" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904424 4759 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904429 4759 flags.go:64] FLAG: --provider-id="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904433 4759 flags.go:64] FLAG: --qos-reserved="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904437 4759 flags.go:64] FLAG: --read-only-port="10255" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904441 4759 flags.go:64] FLAG: --register-node="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904445 4759 flags.go:64] FLAG: --register-schedulable="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904449 4759 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904456 4759 flags.go:64] FLAG: --registry-burst="10" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904460 4759 flags.go:64] FLAG: --registry-qps="5" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904464 4759 flags.go:64] FLAG: --reserved-cpus="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904468 4759 flags.go:64] FLAG: --reserved-memory="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904474 4759 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904478 4759 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904484 4759 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904488 4759 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904493 4759 flags.go:64] FLAG: --runonce="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904497 4759 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904502 4759 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904507 4759 flags.go:64] FLAG: --seccomp-default="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904511 4759 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904515 4759 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904519 4759 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904524 4759 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904528 4759 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904533 
4759 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904537 4759 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904541 4759 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904545 4759 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904549 4759 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904553 4759 flags.go:64] FLAG: --system-cgroups="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904558 4759 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904564 4759 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904568 4759 flags.go:64] FLAG: --tls-cert-file="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904573 4759 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904577 4759 flags.go:64] FLAG: --tls-min-version="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904581 4759 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904585 4759 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904590 4759 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904594 4759 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904598 4759 flags.go:64] FLAG: --v="2" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904603 4759 flags.go:64] FLAG: --version="false" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904609 4759 flags.go:64] FLAG: --vmodule="" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904614 4759 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.904619 4759 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904722 4759 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904727 4759 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904731 4759 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904736 4759 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904745 4759 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904749 4759 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904753 4759 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904758 4759 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
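Note: the flags.go:64 dump above records the kubelet's entire effective command line, one FLAG: --name="value" record apiece, which makes it the easiest place to read off settings such as --node-ip="192.168.126.11", --max-pods="110", and --config="/etc/kubernetes/kubelet.conf". A minimal parser, assuming only the quoting convention shown in those records:

    import re

    # Matches records like: flags.go:64] FLAG: --max-pods="110"
    FLAG_RE = re.compile(r'FLAG: (--[\w-]+)="(.*?)"')

    def parse_flag_dump(journal_text: str) -> dict:
        """Rebuild the kubelet's effective flag set from flags.go:64 records."""
        return dict(FLAG_RE.findall(journal_text))

    sample = 'I1205 00:23:00.904335 4759 flags.go:64] FLAG: --max-pods="110"'
    assert parse_flag_dump(sample) == {"--max-pods": "110"}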
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904763 4759 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904767 4759 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904772 4759 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904775 4759 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904779 4759 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904783 4759 feature_gate.go:330] unrecognized feature gate: Example Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904787 4759 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904792 4759 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904797 4759 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904801 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904805 4759 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904809 4759 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904814 4759 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904818 4759 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904822 4759 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904826 4759 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904829 4759 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904833 4759 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904836 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904840 4759 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904844 4759 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904847 4759 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904851 4759 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904854 4759 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904858 4759 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904862 4759 
feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904865 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904869 4759 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904875 4759 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904878 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904882 4759 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904886 4759 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904889 4759 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904893 4759 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904897 4759 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904900 4759 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904903 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904907 4759 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904911 4759 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904915 4759 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904918 4759 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904922 4759 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904926 4759 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904930 4759 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904933 4759 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904937 4759 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904940 4759 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904944 4759 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904947 4759 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904951 4759 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904955 4759 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets 
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904958 4759 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904962 4759 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904965 4759 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904969 4759 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904973 4759 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904976 4759 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904980 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904984 4759 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904988 4759 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904993 4759 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.904997 4759 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.905000 4759 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.905010 4759 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.912981 4759 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.913011 4759 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913062 4759 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913080 4759 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
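Note: the feature_gate.go:386 record above is the resolved outcome of all the preceding warnings: a Go map dump of the gates the kubelet actually applied, with the unrecognized names absent. The identical map is logged twice more below. A sketch that converts that map[...] literal into a Python dict, assuming only the Name:bool layout visible in the record:

    import re

    # Matches "feature gates: {map[Name:true Name2:false ...]}" records.
    MAP_RE = re.compile(r"feature gates: \{map\[(.*?)\]\}")

    def parse_feature_gate_map(record: str) -> dict:
        """Turn the kubelet's Go-style feature-gate map dump into a dict of bools."""
        body = MAP_RE.search(record).group(1)
        pairs = (item.split(":", 1) for item in body.split())
        return {name: value == "true" for name, value in pairs}

    sample = "feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}"
    gates = parse_feature_gate_map(sample)
    assert gates["CloudDualStackNodeIPs"] and not gates["NodeSwap"]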
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913087 4759 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913091 4759 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913096 4759 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913100 4759 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913104 4759 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913108 4759 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913112 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913115 4759 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913119 4759 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913122 4759 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913126 4759 feature_gate.go:330] unrecognized feature gate: Example Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913130 4759 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913133 4759 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913137 4759 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913140 4759 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913144 4759 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913147 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913151 4759 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913154 4759 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913158 4759 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913161 4759 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913165 4759 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913170 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913174 4759 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913181 4759 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913187 4759 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913192 4759 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913197 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913202 4759 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913207 4759 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913211 4759 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913217 4759 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913221 4759 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913225 4759 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913229 4759 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913233 4759 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913237 4759 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913241 4759 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913245 4759 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913249 4759 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913253 4759 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913258 4759 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913262 4759 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913267 4759 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913271 4759 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913274 4759 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913278 4759 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913282 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913286 4759 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913290 4759 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913297 4759 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913318 4759 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913324 4759 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913328 4759 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913331 4759 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913335 4759 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913339 4759 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913342 4759 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913346 4759 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913350 4759 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913354 4759 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913357 4759 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913361 4759 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913365 4759 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913369 4759 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913372 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913376 4759 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913379 4759 
feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913383 4759 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.913388 4759 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913494 4759 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913500 4759 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913504 4759 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913508 4759 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913512 4759 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913516 4759 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913520 4759 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913524 4759 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913528 4759 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913531 4759 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913535 4759 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913539 4759 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913543 4759 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913548 4759 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913552 4759 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913556 4759 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913560 4759 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913563 4759 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913567 4759 feature_gate.go:330] unrecognized feature gate: Example Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913571 4759 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 00:23:00 crc 
kubenswrapper[4759]: W1205 00:23:00.913575 4759 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913581 4759 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913585 4759 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913590 4759 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913594 4759 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913598 4759 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913602 4759 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913606 4759 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913610 4759 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913614 4759 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913617 4759 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913621 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913624 4759 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913628 4759 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913632 4759 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913636 4759 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913641 4759 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913645 4759 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913649 4759 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913653 4759 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913657 4759 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913662 4759 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913666 4759 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913670 4759 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913674 4759 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913678 4759 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913683 4759 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913687 4759 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913691 4759 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913696 4759 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913701 4759 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913705 4759 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913708 4759 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913712 4759 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913716 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913720 4759 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913723 4759 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913727 4759 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913730 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913734 4759 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913738 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913742 4759 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913746 4759 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913749 4759 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913753 4759 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913757 4759 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913761 4759 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913765 4759 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913768 4759 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913772 4759 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.913776 4759 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.913781 4759 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.914016 4759 
server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.917595 4759 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.917670 4759 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.918116 4759 server.go:997] "Starting client certificate rotation" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.918134 4759 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.918450 4759 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-22 22:43:04.181542459 +0000 UTC Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.918539 4759 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 430h20m3.263006754s for next certificate rotation Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.922518 4759 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.924486 4759 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.935332 4759 log.go:25] "Validated CRI v1 runtime API" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.958370 4759 log.go:25] "Validated CRI v1 image API" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.959977 4759 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.962124 4759 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-05-00-18-22-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.962155 4759 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.978734 4759 manager.go:217] Machine: {Timestamp:2025-12-05 00:23:00.977271272 +0000 UTC m=+0.192932242 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:26c1e85a-f767-4d62-bae3-5a75555c0ad9 BootID:3291f5b6-2cb9-45e1-be99-f12561717489 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 
Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e6:86:bd Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e6:86:bd Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:df:fc:88 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a5:71:05 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:1e:9b:8b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:6c:7f:5f Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:83:ce:00 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:26:cc:73:fa:01:8f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:96:91:83:0b:bb:c9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] 
SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.979292 4759 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.979551 4759 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.980683 4759 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.981853 4759 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.981944 4759 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.982279 4759 topology_manager.go:138] "Creating topology manager with none policy" Dec 05 00:23:00 crc 
kubenswrapper[4759]: I1205 00:23:00.982295 4759 container_manager_linux.go:303] "Creating device plugin manager" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.982520 4759 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.982578 4759 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.983011 4759 state_mem.go:36] "Initialized new in-memory state store" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.983152 4759 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.984149 4759 kubelet.go:418] "Attempting to sync node with API server" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.984180 4759 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.984215 4759 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.984238 4759 kubelet.go:324] "Adding apiserver pod source" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.984255 4759 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.986410 4759 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.987156 4759 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.987297 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.987329 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:00 crc kubenswrapper[4759]: E1205 00:23:00.987381 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.150:6443: connect: connection refused" logger="UnhandledError" Dec 05 00:23:00 crc kubenswrapper[4759]: E1205 00:23:00.987404 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.150:6443: connect: connection refused" logger="UnhandledError" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.987834 4759 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988285 4759 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/portworx-volume" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988327 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988335 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988343 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988355 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988362 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988370 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988381 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988389 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988397 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988407 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988415 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.988749 4759 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.989172 4759 server.go:1280] "Started kubelet" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.989476 4759 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.989557 4759 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.989550 4759 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.990257 4759 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.990792 4759 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.990863 4759 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.990999 4759 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 06:54:27.19037293 +0000 UTC Dec 05 00:23:00 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 05 00:23:00 crc kubenswrapper[4759]: E1205 00:23:00.991152 4759 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.991387 4759 server.go:460] "Adding debug handlers to kubelet server"
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.991761 4759 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.991793 4759 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.991843 4759 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 05 00:23:00 crc kubenswrapper[4759]: E1205 00:23:00.992342 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="200ms"
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.992381 4759 factory.go:55] Registering systemd factory
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.992404 4759 factory.go:221] Registration of the systemd container factory successfully
Dec 05 00:23:00 crc kubenswrapper[4759]: E1205 00:23:00.992209 4759 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.150:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e29e824030bff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 00:23:00.989144063 +0000 UTC m=+0.204805003,LastTimestamp:2025-12-05 00:23:00.989144063 +0000 UTC m=+0.204805003,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 00:23:00 crc kubenswrapper[4759]: W1205 00:23:00.993289 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused
Dec 05 00:23:00 crc kubenswrapper[4759]: E1205 00:23:00.993381 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.150:6443: connect: connection refused" logger="UnhandledError"
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.995024 4759 factory.go:153] Registering CRI-O factory
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.995050 4759 factory.go:221] Registration of the crio container factory successfully
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.995113 4759 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.995136 4759 factory.go:103] Registering Raw factory
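Every "dial tcp 38.102.83.150:6443: connect: connection refused" above has the same root cause: kubelet starts before the static-pod kube-apiserver it is about to launch, so its informers, the node-lease controller, and the event recorder all fail until that pod is up. A throwaway Go probe (hypothetical; assumes the same host and port are reachable from wherever it runs) that waits for the endpoint to start accepting connections:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // The endpoint every failed call above is trying to reach.
        const addr = "api-int.crc.testing:6443"
        for {
            c, err := net.DialTimeout("tcp", addr, 2*time.Second)
            if err == nil {
                c.Close()
                fmt.Println("apiserver is accepting TCP connections")
                return
            }
            fmt.Println("still down:", err)
            time.Sleep(time.Second)
        }
    }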
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.995152 4759 manager.go:1196] Started watching for new ooms in manager
Dec 05 00:23:00 crc kubenswrapper[4759]: I1205 00:23:00.995738 4759 manager.go:319] Starting recovery of all containers
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004458 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004515 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004526 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004536 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004545 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004555 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004565 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004577 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004588 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004598 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004607 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004616 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004625 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004638 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004648 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004658 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004668 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004678 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004686 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004694 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004703 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004713 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004721 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004728 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004737 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004746 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004762 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004775 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004787 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004802 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004813 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004823 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004834 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004851 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004863 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004876 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004896 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004909 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004924 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004937 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004950 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004963 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004975 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.004989 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005001 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005013 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005026 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005039 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005052 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005065 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005077 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005093 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005111 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005126 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005139 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005153 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005166 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005182 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005196 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005209 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005220 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005232 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005245 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005259 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005278 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005296 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005330 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005349 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005369 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005387 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005406 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005421 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005435 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005448 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005462 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005476 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005490 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005504 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.005517 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006461 4759 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006494 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006512 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006530 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006545 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006559 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006574 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006590 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006605 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006618 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006632 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006647 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006663 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006677 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006695 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006710 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006728 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006743 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006758 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006776 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006834 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006856 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006869 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006884 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006896 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006909 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006932 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006948 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006963 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006981 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.006999 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007016 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007031 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007047 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007063 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007081 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007096 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007112 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007128 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007142 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007157 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007171 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007186 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007201 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007214 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007228 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007241 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007255 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007269 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007281 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007294 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007361 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007378 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007393 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007406 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007420 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007435 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007450 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007464 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007477 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007489 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007505 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007520 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007534 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9"
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007547 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007561 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007575 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007589 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007604 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007617 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007632 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007645 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007659 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007672 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007684 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007698 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007711 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007724 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007739 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007755 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007771 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007786 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007803 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007819 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007835 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007850 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007865 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007879 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007893 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007908 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007924 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007940 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007955 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007972 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.007987 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008002 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008018 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008033 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008047 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008068 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008085 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008099 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008113 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008126 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008138 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008149 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008159 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008169 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008179 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008189 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008198 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008207 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008220 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008234 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008252 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008268 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008280 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008291 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008321 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008336 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008353 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008375 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008390 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008407 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008423 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008438 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008453 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008466 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008481 4759 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008493 4759 reconstruct.go:97] "Volume reconstruction finished" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.008504 4759 reconciler.go:26] "Reconciler: start to sync state" Dec 05 00:23:01 crc 
kubenswrapper[4759]: I1205 00:23:01.017292 4759 manager.go:324] Recovery completed Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.028338 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.030028 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.030109 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.030122 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.031903 4759 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.031921 4759 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.031948 4759 state_mem.go:36] "Initialized new in-memory state store" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.049980 4759 policy_none.go:49] "None policy: Start" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.051487 4759 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.051672 4759 state_mem.go:35] "Initializing new in-memory state store" Dec 05 00:23:01 crc kubenswrapper[4759]: E1205 00:23:01.092248 4759 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.150519 4759 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.153773 4759 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.154402 4759 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.154487 4759 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 00:23:01 crc kubenswrapper[4759]: E1205 00:23:01.154556 4759 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 00:23:01 crc kubenswrapper[4759]: W1205 00:23:01.155786 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:01 crc kubenswrapper[4759]: E1205 00:23:01.155840 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.150:6443: connect: connection refused" logger="UnhandledError" Dec 05 00:23:01 crc kubenswrapper[4759]: E1205 00:23:01.192643 4759 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.192773 4759 manager.go:334] "Starting Device Plugin manager" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.193087 4759 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.193116 4759 server.go:79] "Starting device plugin registration server" Dec 05 00:23:01 crc kubenswrapper[4759]: E1205 00:23:01.193211 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="400ms" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.193690 4759 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.193721 4759 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.194158 4759 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.194394 4759 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.194407 4759 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 00:23:01 crc kubenswrapper[4759]: E1205 00:23:01.202056 4759 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.255439 4759 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.255579 4759 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.257012 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.257058 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.257087 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.257229 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.257831 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.257873 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.258528 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.258572 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.258582 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.258744 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.258912 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.258945 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.259707 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.259742 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.259707 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.259779 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.259788 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.259756 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.259873 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.259985 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.260018 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.260609 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.260645 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.260657 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.260685 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.260709 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.260725 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.260879 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.261029 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.261062 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.261348 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.261389 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.261406 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.261890 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.261907 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.261924 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.261938 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.261928 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.262024 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.262093 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.262119 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.263190 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.263269 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.263289 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.293915 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.295583 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.295613 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.295627 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.295656 4759 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 00:23:01 crc kubenswrapper[4759]: E1205 00:23:01.296203 4759 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.150:6443: connect: connection refused" node="crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.312867 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.312919 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.312957 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313054 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313129 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313193 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313245 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313293 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313346 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313375 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313394 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313412 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313433 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313455 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.313474 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415350 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415416 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415442 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415465 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415490 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415507 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415525 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415545 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 
00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415565 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415582 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415600 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415605 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415669 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415622 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415724 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415757 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415778 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415813 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:01 crc 
kubenswrapper[4759]: I1205 00:23:01.415816 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415849 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415861 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415895 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415926 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415956 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415984 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.416009 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.415788 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.416046 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.416076 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.416105 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.496565 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.501097 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.501155 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.501172 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.501203 4759 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 00:23:01 crc kubenswrapper[4759]: E1205 00:23:01.501772 4759 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.150:6443: connect: connection refused" node="crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.594421 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: E1205 00:23:01.594822 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="800ms" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.603956 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: W1205 00:23:01.625479 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e0c7c9db789851ea32a35abfacf5d6eebba7bd5d758b68a2464d3f715023152a WatchSource:0}: Error finding container e0c7c9db789851ea32a35abfacf5d6eebba7bd5d758b68a2464d3f715023152a: Status 404 returned error can't find the container with id e0c7c9db789851ea32a35abfacf5d6eebba7bd5d758b68a2464d3f715023152a Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.629905 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: W1205 00:23:01.630267 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6d1ed67c2e2174e01c2d41ff6d0cc80cce58e6a32ebd7386e44c59be34df939e WatchSource:0}: Error finding container 6d1ed67c2e2174e01c2d41ff6d0cc80cce58e6a32ebd7386e44c59be34df939e: Status 404 returned error can't find the container with id 6d1ed67c2e2174e01c2d41ff6d0cc80cce58e6a32ebd7386e44c59be34df939e Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.653803 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.661801 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 00:23:01 crc kubenswrapper[4759]: W1205 00:23:01.723594 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-7b29bf4cadc66c034bc37c9446c93d176a1623e277a4e8fc3f0616377ad51701 WatchSource:0}: Error finding container 7b29bf4cadc66c034bc37c9446c93d176a1623e277a4e8fc3f0616377ad51701: Status 404 returned error can't find the container with id 7b29bf4cadc66c034bc37c9446c93d176a1623e277a4e8fc3f0616377ad51701 Dec 05 00:23:01 crc kubenswrapper[4759]: W1205 00:23:01.729674 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-548b9cc2a14c0034b882fb11f8d79ab7ece3204943f9516c51bd6ec5a350af82 WatchSource:0}: Error finding container 548b9cc2a14c0034b882fb11f8d79ab7ece3204943f9516c51bd6ec5a350af82: Status 404 returned error can't find the container with id 548b9cc2a14c0034b882fb11f8d79ab7ece3204943f9516c51bd6ec5a350af82 Dec 05 00:23:01 crc kubenswrapper[4759]: W1205 00:23:01.734932 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4b2498d83cf5365d729aed8fd347986fd016c35bdb0eaf6566feac35d39a0709 WatchSource:0}: Error finding container 4b2498d83cf5365d729aed8fd347986fd016c35bdb0eaf6566feac35d39a0709: Status 404 returned error can't find the container with id 4b2498d83cf5365d729aed8fd347986fd016c35bdb0eaf6566feac35d39a0709 Dec 05 00:23:01 crc kubenswrapper[4759]: W1205 00:23:01.888809 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:01 crc kubenswrapper[4759]: E1205 00:23:01.888878 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.150:6443: connect: connection refused" logger="UnhandledError" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.902550 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:01 
crc kubenswrapper[4759]: I1205 00:23:01.904327 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.904355 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.904364 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.904426 4759 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 00:23:01 crc kubenswrapper[4759]: E1205 00:23:01.904739 4759 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.150:6443: connect: connection refused" node="crc" Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.990045 4759 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:01 crc kubenswrapper[4759]: I1205 00:23:01.992105 4759 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 23:20:24.087620328 +0000 UTC Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.162073 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9"} Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.162191 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4b2498d83cf5365d729aed8fd347986fd016c35bdb0eaf6566feac35d39a0709"} Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.164243 4759 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef" exitCode=0 Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.164332 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef"} Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.164367 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"548b9cc2a14c0034b882fb11f8d79ab7ece3204943f9516c51bd6ec5a350af82"} Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.164500 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.165633 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.165669 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:02 crc 
kubenswrapper[4759]: I1205 00:23:02.165709 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.166491 4759 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5" exitCode=0 Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.166547 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5"} Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.166631 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b29bf4cadc66c034bc37c9446c93d176a1623e277a4e8fc3f0616377ad51701"} Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.166758 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.167698 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.167714 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.167722 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.168123 4759 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="39d53281ad03493aa2b3a99a88d7a554601c10252d24d053d0e6b03256f0a314" exitCode=0 Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.168208 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"39d53281ad03493aa2b3a99a88d7a554601c10252d24d053d0e6b03256f0a314"} Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.168347 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e0c7c9db789851ea32a35abfacf5d6eebba7bd5d758b68a2464d3f715023152a"} Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.168431 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.169152 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.169177 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.169188 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.170290 4759 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd" 
exitCode=0 Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.170343 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd"} Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.170370 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6d1ed67c2e2174e01c2d41ff6d0cc80cce58e6a32ebd7386e44c59be34df939e"} Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.170475 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.170499 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.173499 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.173553 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.173564 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.173828 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.173940 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.174116 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:02 crc kubenswrapper[4759]: W1205 00:23:02.260831 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:02 crc kubenswrapper[4759]: E1205 00:23:02.261102 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.150:6443: connect: connection refused" logger="UnhandledError" Dec 05 00:23:02 crc kubenswrapper[4759]: W1205 00:23:02.306363 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:02 crc kubenswrapper[4759]: E1205 00:23:02.306443 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.150:6443: connect: connection refused" logger="UnhandledError" Dec 05 00:23:02 crc kubenswrapper[4759]: E1205 00:23:02.396671 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="1.6s" Dec 05 00:23:02 crc kubenswrapper[4759]: W1205 00:23:02.433255 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:02 crc kubenswrapper[4759]: E1205 00:23:02.433396 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.150:6443: connect: connection refused" logger="UnhandledError" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.714111 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.715822 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.715880 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.715893 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.715920 4759 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 00:23:02 crc kubenswrapper[4759]: E1205 00:23:02.716476 4759 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.150:6443: connect: connection refused" node="crc" Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.990842 4759 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:02 crc kubenswrapper[4759]: I1205 00:23:02.992199 4759 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 10:55:12.413276144 +0000 UTC Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.199291 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8ff5c2d198ddc477a01bd417061044b131461ef5cffdce4c56290296fdbb8140"} Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.199472 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"db87ee4414d125d3b2c7793912651ef8da8b812b7cfe36a032bc5ac1bdc9ba84"} Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.205425 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1"} Dec 05 00:23:03 crc 
kubenswrapper[4759]: I1205 00:23:03.205466 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7"} Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.207748 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8cbb649eb107e2fd5b9575d770967c103b7f599f57b6f71d8af4b940a1bc0be0"} Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.207838 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.208963 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.209002 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.209016 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.210725 4759 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74" exitCode=0 Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.210822 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74"} Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.210949 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.211777 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.211815 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.211829 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.212901 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6"} Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.212927 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb"} Dec 05 00:23:03 crc kubenswrapper[4759]: E1205 00:23:03.584699 4759 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.150:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e29e824030bff default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 00:23:00.989144063 +0000 UTC m=+0.204805003,LastTimestamp:2025-12-05 00:23:00.989144063 +0000 UTC m=+0.204805003,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.990410 4759 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:03 crc kubenswrapper[4759]: I1205 00:23:03.992458 4759 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 19:47:10.239978186 +0000 UTC Dec 05 00:23:03 crc kubenswrapper[4759]: E1205 00:23:03.998395 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="3.2s" Dec 05 00:23:04 crc kubenswrapper[4759]: W1205 00:23:04.185498 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:04 crc kubenswrapper[4759]: E1205 00:23:04.185598 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.150:6443: connect: connection refused" logger="UnhandledError" Dec 05 00:23:04 crc kubenswrapper[4759]: W1205 00:23:04.259376 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:04 crc kubenswrapper[4759]: E1205 00:23:04.259452 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.150:6443: connect: connection refused" logger="UnhandledError" Dec 05 00:23:04 crc kubenswrapper[4759]: I1205 00:23:04.317531 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:04 crc kubenswrapper[4759]: I1205 00:23:04.318821 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:04 crc kubenswrapper[4759]: I1205 00:23:04.318859 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:04 crc kubenswrapper[4759]: I1205 00:23:04.318868 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 00:23:04 crc kubenswrapper[4759]: I1205 00:23:04.318889 4759 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 00:23:04 crc kubenswrapper[4759]: E1205 00:23:04.319610 4759 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.150:6443: connect: connection refused" node="crc" Dec 05 00:23:04 crc kubenswrapper[4759]: W1205 00:23:04.870599 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.150:6443: connect: connection refused Dec 05 00:23:04 crc kubenswrapper[4759]: E1205 00:23:04.870910 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.150:6443: connect: connection refused" logger="UnhandledError" Dec 05 00:23:04 crc kubenswrapper[4759]: I1205 00:23:04.993011 4759 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 04:06:18.143738289 +0000 UTC Dec 05 00:23:04 crc kubenswrapper[4759]: I1205 00:23:04.993044 4759 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 699h43m13.150696876s for next certificate rotation Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.219994 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8"} Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.220040 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9"} Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.220059 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3"} Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.220128 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.221430 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.221453 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.221462 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.221635 4759 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39" exitCode=0 Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.221686 4759 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39"} Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.221791 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.222572 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.222609 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.222618 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.225951 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96"} Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.226039 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.226906 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.226937 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.226951 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.228553 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8c52ab59e908cca2e0d8e2cc5b808fb09cc893deb1fa77d516369f36f8bc6dbe"} Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.228625 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.229323 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.229366 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.229375 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.904834 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:05 crc kubenswrapper[4759]: I1205 00:23:05.909736 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.076246 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:06 crc kubenswrapper[4759]: 
I1205 00:23:06.238836 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.238872 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2e6fd723f0bd311cc4164d0f6215eca13aaae99ca736ba3fdf3b6c3750552a00"} Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.238918 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f52e2f9883f6f025e438c9f6cb52d12f65084e6931517f3b135614c9113c0804"} Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.238937 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6c2b27cbaed139042437e0946ecba9e302ec53616546a6e4fc0e9352a462d99b"} Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.238959 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.239097 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.239406 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.238837 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.239513 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.239789 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.239824 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.239840 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.240026 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.240061 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.240070 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.240786 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.240824 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.240839 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:06 crc kubenswrapper[4759]: I1205 00:23:06.484539 4759 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.245864 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9b951da0c48cff251b767d024410ea672ffb21da5638ea463b5e08ca22211e11"} Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.245909 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.245929 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.245937 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b9711e058b4e14a3533033db36c3146dc3ddaccd2daf17d3129343bb782f38f3"} Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.245999 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.246078 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.246945 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.246972 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.246983 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.246945 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.247032 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.247041 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.247566 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.247582 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.247589 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.249618 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.249658 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.249674 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.520136 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.521354 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.521421 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.521444 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:07 crc kubenswrapper[4759]: I1205 00:23:07.521485 4759 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 00:23:08 crc kubenswrapper[4759]: I1205 00:23:08.250997 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:08 crc kubenswrapper[4759]: I1205 00:23:08.251035 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:08 crc kubenswrapper[4759]: I1205 00:23:08.251897 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:08 crc kubenswrapper[4759]: I1205 00:23:08.251923 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:08 crc kubenswrapper[4759]: I1205 00:23:08.251931 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:08 crc kubenswrapper[4759]: I1205 00:23:08.252678 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:08 crc kubenswrapper[4759]: I1205 00:23:08.252709 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:08 crc kubenswrapper[4759]: I1205 00:23:08.252720 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:08 crc kubenswrapper[4759]: I1205 00:23:08.277968 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 05 00:23:09 crc kubenswrapper[4759]: I1205 00:23:09.254435 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:09 crc kubenswrapper[4759]: I1205 00:23:09.255745 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:09 crc kubenswrapper[4759]: I1205 00:23:09.255789 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:09 crc kubenswrapper[4759]: I1205 00:23:09.255804 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:11 crc kubenswrapper[4759]: E1205 00:23:11.202148 4759 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 00:23:12 crc kubenswrapper[4759]: I1205 00:23:12.630717 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:12 crc kubenswrapper[4759]: I1205 00:23:12.630899 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:12 crc kubenswrapper[4759]: I1205 00:23:12.632059 4759 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:12 crc kubenswrapper[4759]: I1205 00:23:12.632091 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:12 crc kubenswrapper[4759]: I1205 00:23:12.632102 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:12 crc kubenswrapper[4759]: I1205 00:23:12.634341 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:13 crc kubenswrapper[4759]: I1205 00:23:13.264169 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:13 crc kubenswrapper[4759]: I1205 00:23:13.326120 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:13 crc kubenswrapper[4759]: I1205 00:23:13.326158 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:13 crc kubenswrapper[4759]: I1205 00:23:13.326171 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:14 crc kubenswrapper[4759]: I1205 00:23:14.010405 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:14 crc kubenswrapper[4759]: I1205 00:23:14.266693 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:14 crc kubenswrapper[4759]: I1205 00:23:14.267620 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:14 crc kubenswrapper[4759]: I1205 00:23:14.267819 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:14 crc kubenswrapper[4759]: I1205 00:23:14.267953 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:14 crc kubenswrapper[4759]: I1205 00:23:14.991986 4759 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 05 00:23:15 crc kubenswrapper[4759]: W1205 00:23:15.095779 4759 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 00:23:15 crc kubenswrapper[4759]: I1205 00:23:15.095890 4759 trace.go:236] Trace[1493287551]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 00:23:05.094) (total time: 10001ms): Dec 05 00:23:15 crc kubenswrapper[4759]: Trace[1493287551]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:23:15.095) Dec 05 00:23:15 crc kubenswrapper[4759]: Trace[1493287551]: [10.001810742s] [10.001810742s] END Dec 05 00:23:15 crc kubenswrapper[4759]: E1205 00:23:15.095916 4759 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 00:23:15 crc kubenswrapper[4759]: I1205 00:23:15.663533 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 05 00:23:15 crc kubenswrapper[4759]: I1205 00:23:15.663721 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:15 crc kubenswrapper[4759]: I1205 00:23:15.664616 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:15 crc kubenswrapper[4759]: I1205 00:23:15.664654 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:15 crc kubenswrapper[4759]: I1205 00:23:15.664667 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:15 crc kubenswrapper[4759]: I1205 00:23:15.688872 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 05 00:23:16 crc kubenswrapper[4759]: I1205 00:23:16.076510 4759 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 00:23:16 crc kubenswrapper[4759]: I1205 00:23:16.076610 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 00:23:16 crc kubenswrapper[4759]: I1205 00:23:16.280687 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:16 crc kubenswrapper[4759]: I1205 00:23:16.282541 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:16 crc kubenswrapper[4759]: I1205 00:23:16.282577 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:16 crc kubenswrapper[4759]: I1205 00:23:16.282585 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:16 crc kubenswrapper[4759]: I1205 00:23:16.299626 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 05 00:23:16 crc kubenswrapper[4759]: I1205 00:23:16.436948 4759 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 00:23:16 crc kubenswrapper[4759]: I1205 00:23:16.436997 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 
00:23:17 crc kubenswrapper[4759]: I1205 00:23:17.010287 4759 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 00:23:17 crc kubenswrapper[4759]: I1205 00:23:17.010378 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 00:23:17 crc kubenswrapper[4759]: I1205 00:23:17.282883 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:17 crc kubenswrapper[4759]: I1205 00:23:17.283779 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:17 crc kubenswrapper[4759]: I1205 00:23:17.283855 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:17 crc kubenswrapper[4759]: I1205 00:23:17.283874 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:19 crc kubenswrapper[4759]: I1205 00:23:19.766086 4759 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.081468 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.081650 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.082669 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.082707 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.082718 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.086795 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:21 crc kubenswrapper[4759]: E1205 00:23:21.202236 4759 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.322941 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.323901 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.323938 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.323948 4759 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:21 crc kubenswrapper[4759]: E1205 00:23:21.427108 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.432390 4759 trace.go:236] Trace[122808157]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 00:23:09.351) (total time: 12080ms): Dec 05 00:23:21 crc kubenswrapper[4759]: Trace[122808157]: ---"Objects listed" error: 12080ms (00:23:21.432) Dec 05 00:23:21 crc kubenswrapper[4759]: Trace[122808157]: [12.080829451s] [12.080829451s] END Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.432416 4759 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.435599 4759 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.435795 4759 trace.go:236] Trace[1197137319]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 00:23:10.126) (total time: 11309ms): Dec 05 00:23:21 crc kubenswrapper[4759]: Trace[1197137319]: ---"Objects listed" error: 11308ms (00:23:21.435) Dec 05 00:23:21 crc kubenswrapper[4759]: Trace[1197137319]: [11.309090549s] [11.309090549s] END Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.435826 4759 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.437240 4759 trace.go:236] Trace[1306066521]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 00:23:08.558) (total time: 12878ms): Dec 05 00:23:21 crc kubenswrapper[4759]: Trace[1306066521]: ---"Objects listed" error: 12878ms (00:23:21.437) Dec 05 00:23:21 crc kubenswrapper[4759]: Trace[1306066521]: [12.878407618s] [12.878407618s] END Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.437272 4759 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 00:23:21 crc kubenswrapper[4759]: E1205 00:23:21.437561 4759 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.488176 4759 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38026->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.488258 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38026->192.168.126.11:17697: read: connection reset by peer" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.488201 4759 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38022->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.488504 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38022->192.168.126.11:17697: read: connection reset by peer" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.488659 4759 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.488713 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 00:23:21 crc kubenswrapper[4759]: I1205 00:23:21.997337 4759 apiserver.go:52] "Watching apiserver" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.000938 4759 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.001370 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-7lmmf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.001782 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.001906 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.001964 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.002048 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.002099 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.002130 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.002291 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.002502 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.002582 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.002691 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7lmmf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.004456 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.004717 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.005231 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.005444 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.005854 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.005939 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.005973 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.005934 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.005993 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.005976 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.006930 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.007678 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.037219 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.048553 4759 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.048842 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.093134 4759 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.101396 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.111922 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.121805 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.133577 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.139711 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.139767 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.139792 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.139816 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.139837 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.139861 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.139886 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.139909 4759 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.139932 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.139959 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.139989 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140013 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140036 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140059 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140085 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140107 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140151 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140187 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140421 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140448 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140441 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140472 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140521 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140547 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140585 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140570 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140683 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140712 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140748 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140802 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140939 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.140981 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141006 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141002 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141030 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141034 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141055 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141095 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141121 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141149 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141218 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141244 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141268 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141292 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141338 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141363 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141384 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141412 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141426 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141434 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141440 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141434 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141561 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141594 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141630 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141666 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141692 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141720 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141744 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141871 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141473 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141902 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141514 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141909 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141595 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141901 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141695 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141955 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141972 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141732 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141740 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141750 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141805 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141840 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141910 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142125 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142143 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142161 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142196 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142213 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142188 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142243 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.141960 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142284 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142296 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142352 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142378 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142397 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142407 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142404 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142423 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142447 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142475 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142493 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142501 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142565 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142597 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142602 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142659 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142690 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142720 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142755 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142790 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142615 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142677 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142709 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142785 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.142830 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:23:22.642807134 +0000 UTC m=+21.858468164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143104 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143133 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143137 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143146 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143153 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143190 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143207 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143224 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143242 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143249 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143261 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143284 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143320 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143341 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143358 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143376 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143402 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143424 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143442 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143458 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143473 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143490 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143508 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143528 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143543 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143559 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143576 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143595 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143611 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143627 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143643 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143659 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143676 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143691 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143708 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143727 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143748 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143770 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143788 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143808 4759 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143823 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143845 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143862 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143876 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143895 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143911 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143927 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143944 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143959 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143975 
4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143991 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144008 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144024 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144040 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144057 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144073 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144089 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144105 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144121 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:22 crc 
kubenswrapper[4759]: I1205 00:23:22.144139 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144157 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144172 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144187 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144203 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144217 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144232 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144247 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144262 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144277 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144295 4759 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145061 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145084 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145102 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145117 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145142 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145159 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145174 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145190 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145205 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145224 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145241 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145259 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145277 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145294 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145357 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145379 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145399 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145420 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145437 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145453 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145473 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145491 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145509 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145526 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145542 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145593 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145615 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145631 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145649 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145666 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145682 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145698 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145714 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145730 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145745 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145764 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145789 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145807 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145823 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145838 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145855 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145898 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145919 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145937 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145953 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145969 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145985 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146000 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146016 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146032 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146048 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146064 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146081 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146100 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146118 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146135 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146150 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146165 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146182 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146198 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146234 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146258 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146282 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146316 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146335 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146353 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146374 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk77p\" (UniqueName: \"kubernetes.io/projected/e42923da-2632-4c20-a3e8-26d46dccd346-kube-api-access-kk77p\") pod \"node-resolver-7lmmf\" (UID: \"e42923da-2632-4c20-a3e8-26d46dccd346\") " pod="openshift-dns/node-resolver-7lmmf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 
Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146413 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147898 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147921 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147941 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147957 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147973 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e42923da-2632-4c20-a3e8-26d46dccd346-hosts-file\") pod \"node-resolver-7lmmf\" (UID: \"e42923da-2632-4c20-a3e8-26d46dccd346\") " pod="openshift-dns/node-resolver-7lmmf"
Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147993 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148008 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148070 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148088 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148104 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148114 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148124 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148133 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148142 4759 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148152 4759 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148161 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148172 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148183 4759 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148194 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148204 4759 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148214 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148224 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148234 4759 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148245 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148255 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148264 4759 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148274 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148283 4759 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148293 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148316 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148326 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148336 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148345 4759 reconciler_common.go:293] "Volume detached for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148354 4759 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148363 4759 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148373 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148382 4759 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148391 4759 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148417 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148426 4759 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148435 4759 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148445 4759 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148455 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148465 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148476 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148486 4759 reconciler_common.go:293] "Volume 
detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148494 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.148504 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.149573 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143320 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.156731 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143413 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142871 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143036 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143463 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.156851 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.142840 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143458 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143682 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143874 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.143897 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144048 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.144108 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145405 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145448 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.145541 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146761 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146777 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146793 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146946 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146974 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.146992 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147011 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). 
InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147070 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147108 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147133 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147170 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147215 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147249 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147282 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147388 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147442 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147569 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147634 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.147720 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.149618 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.149758 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.149930 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.149964 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.149984 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.150198 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.150389 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.150554 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.150921 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.150942 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.151843 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.151864 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.152075 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.152215 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.152360 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.152515 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.152567 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.152603 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.152633 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.152994 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.153027 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.153035 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.153156 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.153556 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.155816 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.156103 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.156463 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.156463 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.156612 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157166 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.156658 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.156879 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157052 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157072 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157233 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157242 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157078 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157120 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157346 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157507 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157572 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157750 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157821 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.157620 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.158238 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.158226 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.158574 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.159047 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.159388 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.159514 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.159533 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.159754 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.159895 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.160123 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.160360 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.160366 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.161500 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.162767 4759 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.164159 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.164461 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.165429 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.165436 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.165630 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.165941 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.166050 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.166236 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.166464 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.166494 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.166645 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.166675 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.166691 4759 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.166770 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:22.6667414 +0000 UTC m=+21.882402370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.166849 4759 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.166797 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.166884 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:22.666874465 +0000 UTC m=+21.882535425 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.167060 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.167086 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.167351 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.167392 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.167396 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.167427 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.167611 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.167697 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.167709 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.168497 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.168567 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.168615 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.168764 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.168974 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.169409 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.169517 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.169625 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.169875 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.169878 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.169996 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.170046 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.170289 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.171080 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.171207 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.172668 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.173925 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.175436 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.175963 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.177181 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.177635 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.178949 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.180763 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.181022 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.180608 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.181389 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.181465 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.182062 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.183444 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.184284 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.184332 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.184346 4759 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.184395 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:22.684380083 +0000 UTC m=+21.900041033 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.184648 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.184798 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.184827 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.184841 4759 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.184947 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.184987 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:22.684913456 +0000 UTC m=+21.900574396 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.185602 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.185961 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.186244 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.186453 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.188140 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.188593 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.190935 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.193473 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.193583 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.193674 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.194053 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.194535 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.194840 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.196501 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.201867 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.219742 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.221844 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250526 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk77p\" (UniqueName: \"kubernetes.io/projected/e42923da-2632-4c20-a3e8-26d46dccd346-kube-api-access-kk77p\") pod \"node-resolver-7lmmf\" (UID: \"e42923da-2632-4c20-a3e8-26d46dccd346\") " pod="openshift-dns/node-resolver-7lmmf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250584 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250613 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e42923da-2632-4c20-a3e8-26d46dccd346-hosts-file\") pod \"node-resolver-7lmmf\" (UID: \"e42923da-2632-4c20-a3e8-26d46dccd346\") " pod="openshift-dns/node-resolver-7lmmf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250641 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250696 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250711 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250723 4759 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250734 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250745 4759 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250756 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250767 4759 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" 
DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250777 4759 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250788 4759 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250799 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250810 4759 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250820 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250829 4759 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250840 4759 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250849 4759 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250858 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250867 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250877 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250887 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250897 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250907 4759 reconciler_common.go:293] "Volume detached for 
volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250917 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250930 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250940 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250951 4759 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250961 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250971 4759 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250981 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.250992 4759 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251003 4759 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251014 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251026 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251036 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 
00:23:22.251046 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251056 4759 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251066 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251076 4759 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251089 4759 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251100 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251111 4759 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251121 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251132 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251156 4759 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251167 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251179 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251191 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: 
I1205 00:23:22.251214 4759 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251225 4759 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251236 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251246 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251256 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251267 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251277 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251288 4759 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251298 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251328 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251340 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251352 4759 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251363 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251374 4759 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251385 4759 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251395 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251405 4759 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251416 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251428 4759 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251439 4759 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251449 4759 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251459 4759 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251469 4759 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251479 4759 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251491 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251503 4759 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251513 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251524 4759 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251535 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251545 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251556 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251566 4759 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251577 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251588 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251601 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251611 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251623 4759 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251635 4759 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251647 4759 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251659 4759 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251670 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251681 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251692 4759 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251704 4759 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251715 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251725 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251735 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251746 4759 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251760 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251771 4759 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251782 4759 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251793 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251804 4759 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251957 4759 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251978 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.251990 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252009 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252021 4759 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252034 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252045 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252057 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252069 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252271 4759 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252290 4759 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252323 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252334 4759 reconciler_common.go:293] "Volume detached for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252344 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252472 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252488 4759 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252499 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252509 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252519 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252530 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252540 4759 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252551 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252561 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252570 4759 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252572 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252581 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252664 4759 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252678 4759 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252691 4759 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252703 4759 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252717 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252730 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252742 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252753 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252764 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252775 4759 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252787 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252800 4759 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252812 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252824 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252837 4759 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252850 4759 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252862 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252874 4759 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252886 4759 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252898 4759 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252911 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252922 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252934 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252946 4759 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.252959 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" 
(UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.253055 4759 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.253069 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.253081 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.253092 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.253128 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.253128 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e42923da-2632-4c20-a3e8-26d46dccd346-hosts-file\") pod \"node-resolver-7lmmf\" (UID: \"e42923da-2632-4c20-a3e8-26d46dccd346\") " pod="openshift-dns/node-resolver-7lmmf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.277067 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk77p\" (UniqueName: \"kubernetes.io/projected/e42923da-2632-4c20-a3e8-26d46dccd346-kube-api-access-kk77p\") pod \"node-resolver-7lmmf\" (UID: \"e42923da-2632-4c20-a3e8-26d46dccd346\") " pod="openshift-dns/node-resolver-7lmmf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.320889 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.329452 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.330269 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.331418 4759 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8" exitCode=255 Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.331460 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8"} Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.340612 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7lmmf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.341943 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.348076 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.355580 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.378380 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.403163 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.414991 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-wdk4j"] Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.416072 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-llpn6"] Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.416570 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.416662 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.423180 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.423527 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.423539 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.432294 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.434478 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.434692 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.435459 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.435821 4759 scope.go:117] "RemoveContainer" 
containerID="7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.436196 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.436543 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.448126 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.470145 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.496149 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.507668 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.543607 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.553624 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.554985 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-os-release\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555034 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-run-netns\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555057 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-var-lib-kubelet\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555078 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-var-lib-cni-bin\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555099 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-var-lib-cni-multus\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555121 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-daemon-config\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555141 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555160 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-os-release\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555180 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b33957c4-8ef0-4b57-8e3c-183091f3b022-cni-binary-copy\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555199 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-hostroot\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555217 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-cnibin\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555235 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-conf-dir\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555317 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-socket-dir-parent\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555338 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-run-multus-certs\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555397 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-system-cni-dir\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555417 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555435 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-run-k8s-cni-cncf-io\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555488 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-system-cni-dir\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555507 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-cnibin\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555525 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-cni-dir\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555546 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555564 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqmrs\" (UniqueName: \"kubernetes.io/projected/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-kube-api-access-kqmrs\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555585 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-etc-kubernetes\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.555602 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wwnj\" (UniqueName: \"kubernetes.io/projected/b33957c4-8ef0-4b57-8e3c-183091f3b022-kube-api-access-6wwnj\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " 
pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.572223 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.586077 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-che
ck-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.598066 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.615636 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.627382 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.641427 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656521 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656617 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-os-release\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.656646 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:23:23.65663028 +0000 UTC m=+22.872291230 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656668 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-var-lib-kubelet\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656695 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-run-netns\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656713 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-daemon-config\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656730 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-var-lib-cni-bin\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656759 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-var-lib-cni-multus\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656774 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656788 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-os-release\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656802 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b33957c4-8ef0-4b57-8e3c-183091f3b022-cni-binary-copy\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656832 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-hostroot\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656846 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-cnibin\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656859 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-conf-dir\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656891 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-socket-dir-parent\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656906 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-run-multus-certs\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656925 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-system-cni-dir\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656928 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-os-release\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656962 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-run-k8s-cni-cncf-io\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656979 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656988 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-os-release\") pod 
\"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.656996 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-system-cni-dir\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657026 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-cnibin\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657043 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-cni-dir\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657046 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-run-netns\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657058 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wwnj\" (UniqueName: \"kubernetes.io/projected/b33957c4-8ef0-4b57-8e3c-183091f3b022-kube-api-access-6wwnj\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657071 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657086 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqmrs\" (UniqueName: \"kubernetes.io/projected/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-kube-api-access-kqmrs\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657102 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-etc-kubernetes\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657157 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-etc-kubernetes\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " 
pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657298 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-var-lib-cni-bin\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657369 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-var-lib-cni-multus\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657586 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657716 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-socket-dir-parent\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657763 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-system-cni-dir\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657735 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-cni-dir\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657028 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-var-lib-kubelet\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657807 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-cnibin\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657839 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-hostroot\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657840 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657876 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-cnibin\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657880 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-run-multus-certs\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657836 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-daemon-config\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657902 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-multus-conf-dir\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657920 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-host-run-k8s-cni-cncf-io\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657922 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b33957c4-8ef0-4b57-8e3c-183091f3b022-system-cni-dir\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.657842 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b33957c4-8ef0-4b57-8e3c-183091f3b022-cni-binary-copy\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.658212 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: \"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.673074 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqmrs\" (UniqueName: \"kubernetes.io/projected/51a56acd-2ed2-498f-bcd1-93cd4ce2a21c-kube-api-access-kqmrs\") pod \"multus-additional-cni-plugins-wdk4j\" (UID: 
\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\") " pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.673611 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wwnj\" (UniqueName: \"kubernetes.io/projected/b33957c4-8ef0-4b57-8e3c-183091f3b022-kube-api-access-6wwnj\") pod \"multus-llpn6\" (UID: \"b33957c4-8ef0-4b57-8e3c-183091f3b022\") " pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.754269 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.757558 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.757617 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.757652 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.757685 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.757772 4759 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.757780 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.757809 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.757804 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.757837 4759 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:23.757813459 +0000 UTC m=+22.973474429 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.757815 4759 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.757844 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.757918 4759 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.757822 4759 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.757900 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:23.757885321 +0000 UTC m=+22.973546271 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.758017 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:23.757997134 +0000 UTC m=+22.973658154 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:22 crc kubenswrapper[4759]: E1205 00:23:22.758035 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:23.758027634 +0000 UTC m=+22.973688684 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:22 crc kubenswrapper[4759]: W1205 00:23:22.766123 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a56acd_2ed2_498f_bcd1_93cd4ce2a21c.slice/crio-efdbb5ba5f2edcf27a657b04fa6cdd915cd30c0896c3da59d353669ea31b01fb WatchSource:0}: Error finding container efdbb5ba5f2edcf27a657b04fa6cdd915cd30c0896c3da59d353669ea31b01fb: Status 404 returned error can't find the container with id efdbb5ba5f2edcf27a657b04fa6cdd915cd30c0896c3da59d353669ea31b01fb Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.768521 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-llpn6" Dec 05 00:23:22 crc kubenswrapper[4759]: W1205 00:23:22.785597 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb33957c4_8ef0_4b57_8e3c_183091f3b022.slice/crio-535ed246fe5452b66d53887b9ef7d7f7f0c00ceaf8b652f2338e15026e41ec7d WatchSource:0}: Error finding container 535ed246fe5452b66d53887b9ef7d7f7f0c00ceaf8b652f2338e15026e41ec7d: Status 404 returned error can't find the container with id 535ed246fe5452b66d53887b9ef7d7f7f0c00ceaf8b652f2338e15026e41ec7d Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.816398 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mbhwx"] Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.817341 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.822879 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.823170 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.823276 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.825193 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.825581 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.825660 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.825926 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.842111 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.913345 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:22Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.934414 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",
\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:22Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.946503 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:22Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959619 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-ovn\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959659 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-log-socket\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959674 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-env-overrides\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959688 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959703 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-script-lib\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959742 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-openvswitch\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959759 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-config\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959774 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-systemd\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959789 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-netd\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959804 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-slash\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959819 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovn-node-metrics-cert\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959834 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-systemd-units\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959848 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-netns\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959864 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-kubelet\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc 
kubenswrapper[4759]: I1205 00:23:22.959883 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959908 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t67cz\" (UniqueName: \"kubernetes.io/projected/45fa490b-1113-4ee6-9604-dc322ca11bd3-kube-api-access-t67cz\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959927 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-etc-openvswitch\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959944 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-bin\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959961 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-node-log\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.959979 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-var-lib-openvswitch\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.966604 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:22Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.976506 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:22Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:22 crc kubenswrapper[4759]: I1205 00:23:22.992977 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:22Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.014216 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.046831 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060422 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-var-lib-openvswitch\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 
00:23:23.060491 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-ovn\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060519 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-log-socket\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060541 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-env-overrides\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060563 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060592 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-script-lib\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060629 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-openvswitch\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060650 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-config\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060672 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-slash\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060690 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-systemd\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060711 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-netd\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060731 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovn-node-metrics-cert\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060755 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-systemd-units\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060776 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-netns\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060797 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-kubelet\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060829 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060856 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-etc-openvswitch\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060879 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-bin\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060899 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t67cz\" (UniqueName: \"kubernetes.io/projected/45fa490b-1113-4ee6-9604-dc322ca11bd3-kube-api-access-t67cz\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060920 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-node-log\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.060984 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-node-log\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.061032 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-var-lib-openvswitch\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.061062 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-ovn\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.061092 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-log-socket\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.061687 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-env-overrides\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.061739 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.062107 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-systemd-units\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.062144 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-bin\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.062187 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-openvswitch\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.062207 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-systemd\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.062147 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-slash\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.062232 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-netd\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.062215 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-netns\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.062182 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-etc-openvswitch\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.062260 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-kubelet\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.062378 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-script-lib\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.062950 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-config\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.063373 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.071812 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.075824 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovn-node-metrics-cert\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.087828 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t67cz\" (UniqueName: \"kubernetes.io/projected/45fa490b-1113-4ee6-9604-dc322ca11bd3-kube-api-access-t67cz\") pod \"ovnkube-node-mbhwx\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.087833 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.155550 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.155632 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.155644 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.155724 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.155796 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.160102 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.160896 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.162901 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.164288 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.165048 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.166413 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.167719 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.168409 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.169516 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: W1205 00:23:23.169858 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45fa490b_1113_4ee6_9604_dc322ca11bd3.slice/crio-1cb4ef7ccc289586aa16f5b531c9a00f2cf64fb13c07df4d007ca3f84648072b WatchSource:0}: Error finding container 1cb4ef7ccc289586aa16f5b531c9a00f2cf64fb13c07df4d007ca3f84648072b: Status 404 returned error can't find the container with id 1cb4ef7ccc289586aa16f5b531c9a00f2cf64fb13c07df4d007ca3f84648072b
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.170242 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.170939 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.172497 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.173469 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.174636 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.175280 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.176667 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.177842 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.181102 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.181892 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.182509 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.183503 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.184080 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.185031 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.185795 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.186781 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.187423 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.188495 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.188960 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.189569 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.190741 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.191353 4759 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.191474 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.194295 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.194917 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.195601 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.198553 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.199748 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.200300 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.201276 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.202704 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.203692 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.204427 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.205623 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.206572 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.207029 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.207944 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.208524 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.209600 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.210048 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.210881 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.211348 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.211861 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.212886 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.213345 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.335008 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7lmmf" event={"ID":"e42923da-2632-4c20-a3e8-26d46dccd346","Type":"ContainerStarted","Data":"8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.335045 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7lmmf" event={"ID":"e42923da-2632-4c20-a3e8-26d46dccd346","Type":"ContainerStarted","Data":"1a64e8c147fef15283f58f5339597c63d741e79e3cdf40d268b79eb6265dcd64"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.337182 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" event={"ID":"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c","Type":"ContainerStarted","Data":"2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.337221 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" event={"ID":"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c","Type":"ContainerStarted","Data":"efdbb5ba5f2edcf27a657b04fa6cdd915cd30c0896c3da59d353669ea31b01fb"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.338899 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ba9d5f269ce7fc6a5a9f4714ff564b570c1b833de2d5912fdf5e97663e517ab5"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.340411 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.340446 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.340460 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3ab07856d8f9588074a57091a78a6cce7888cbf29687dc215d1778142f5ed115"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.342159 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e" exitCode=0
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.342214 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.342236 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"1cb4ef7ccc289586aa16f5b531c9a00f2cf64fb13c07df4d007ca3f84648072b"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.345513 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-llpn6" event={"ID":"b33957c4-8ef0-4b57-8e3c-183091f3b022","Type":"ContainerStarted","Data":"91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.345554 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-llpn6" event={"ID":"b33957c4-8ef0-4b57-8e3c-183091f3b022","Type":"ContainerStarted","Data":"535ed246fe5452b66d53887b9ef7d7f7f0c00ceaf8b652f2338e15026e41ec7d"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.347802 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.347842 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f2f1a9b6ecfc2a0c0e6b11b1979cb27cf3d44683b32333ad3fb789c30bb6fab2"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.349683 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.350661 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.351782 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25"}
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.352425 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.368758 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.383917 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.398817 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z"
Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.410090 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.424580 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.434623 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.447795 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.459884 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.473621 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.494647 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.507262 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-
cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.518500 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.538160 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.550743 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.566121 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-5q8ns"] Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.566591 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.568539 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.569751 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.570046 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.571048 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.571263 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.572510 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.595078 4759 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.608991 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.621698 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.639775 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.655631 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.674379 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.674513 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/879c79ed-3fea-4896-84a5-e3c44d13a0c6-rootfs\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.674540 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/879c79ed-3fea-4896-84a5-e3c44d13a0c6-proxy-tls\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.674562 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5bwg\" (UniqueName: \"kubernetes.io/projected/879c79ed-3fea-4896-84a5-e3c44d13a0c6-kube-api-access-n5bwg\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.674606 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/879c79ed-3fea-4896-84a5-e3c44d13a0c6-mcd-auth-proxy-config\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.674728 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:23:25.674708727 +0000 UTC m=+24.890369677 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.678846 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z 
is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.692993 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.701846 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.721908 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.732893 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.760529 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.775691 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.775729 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.775751 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/879c79ed-3fea-4896-84a5-e3c44d13a0c6-rootfs\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.775765 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/879c79ed-3fea-4896-84a5-e3c44d13a0c6-proxy-tls\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.775781 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5bwg\" (UniqueName: \"kubernetes.io/projected/879c79ed-3fea-4896-84a5-e3c44d13a0c6-kube-api-access-n5bwg\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.775829 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.775845 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/879c79ed-3fea-4896-84a5-e3c44d13a0c6-mcd-auth-proxy-config\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.775861 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.775968 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.775965 4759 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.775983 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.775999 4759 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.776038 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:25.776026759 +0000 UTC m=+24.991687709 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.776055 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:25.77604678 +0000 UTC m=+24.991707730 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.776122 4759 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.776160 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:25.776143942 +0000 UTC m=+24.991804892 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.776192 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/879c79ed-3fea-4896-84a5-e3c44d13a0c6-rootfs\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.776513 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/879c79ed-3fea-4896-84a5-e3c44d13a0c6-mcd-auth-proxy-config\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.776522 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.776540 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.776551 4759 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:23 crc kubenswrapper[4759]: E1205 00:23:23.776595 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:25.776582953 +0000 UTC m=+24.992243893 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.784499 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/879c79ed-3fea-4896-84a5-e3c44d13a0c6-proxy-tls\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.784707 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z 
is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.789906 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5bwg\" (UniqueName: \"kubernetes.io/projected/879c79ed-3fea-4896-84a5-e3c44d13a0c6-kube-api-access-n5bwg\") pod \"machine-config-daemon-5q8ns\" (UID: \"879c79ed-3fea-4896-84a5-e3c44d13a0c6\") " pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.801040 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.812586 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.827679 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.840467 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.861700 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.873021 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.903507 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:23:23 crc kubenswrapper[4759]: W1205 00:23:23.923217 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod879c79ed_3fea_4896_84a5_e3c44d13a0c6.slice/crio-e02206444883eab278a17a81f0a6bc964fa92c06a673f4a6d3047c7c75f4fa98 WatchSource:0}: Error finding container e02206444883eab278a17a81f0a6bc964fa92c06a673f4a6d3047c7c75f4fa98: Status 404 returned error can't find the container with id e02206444883eab278a17a81f0a6bc964fa92c06a673f4a6d3047c7c75f4fa98 Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.947000 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-tnqtq"] Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.947360 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tnqtq" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.949665 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.949856 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.949874 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.950908 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.963020 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:23 crc kubenswrapper[4759]: I1205 00:23:23.985361 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:23Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.009134 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.021432 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.024352 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.027188 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.032446 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.043658 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.061060 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.078754 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b48z\" (UniqueName: \"kubernetes.io/projected/7e0928c1-8104-4803-bf39-f48da5f1fec2-kube-api-access-8b48z\") pod \"node-ca-tnqtq\" (UID: \"7e0928c1-8104-4803-bf39-f48da5f1fec2\") " pod="openshift-image-registry/node-ca-tnqtq" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.078844 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7e0928c1-8104-4803-bf39-f48da5f1fec2-serviceca\") pod \"node-ca-tnqtq\" (UID: \"7e0928c1-8104-4803-bf39-f48da5f1fec2\") " pod="openshift-image-registry/node-ca-tnqtq" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.078886 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e0928c1-8104-4803-bf39-f48da5f1fec2-host\") pod \"node-ca-tnqtq\" (UID: \"7e0928c1-8104-4803-bf39-f48da5f1fec2\") " pod="openshift-image-registry/node-ca-tnqtq" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.087989 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.132805 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d
9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.155460 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:24 crc kubenswrapper[4759]: E1205 00:23:24.155590 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.174486 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.179842 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7e0928c1-8104-4803-bf39-f48da5f1fec2-serviceca\") pod \"node-ca-tnqtq\" (UID: \"7e0928c1-8104-4803-bf39-f48da5f1fec2\") " pod="openshift-image-registry/node-ca-tnqtq" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.179913 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e0928c1-8104-4803-bf39-f48da5f1fec2-host\") pod \"node-ca-tnqtq\" (UID: \"7e0928c1-8104-4803-bf39-f48da5f1fec2\") " pod="openshift-image-registry/node-ca-tnqtq" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.179940 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b48z\" (UniqueName: \"kubernetes.io/projected/7e0928c1-8104-4803-bf39-f48da5f1fec2-kube-api-access-8b48z\") pod \"node-ca-tnqtq\" (UID: \"7e0928c1-8104-4803-bf39-f48da5f1fec2\") " pod="openshift-image-registry/node-ca-tnqtq" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.180192 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e0928c1-8104-4803-bf39-f48da5f1fec2-host\") pod \"node-ca-tnqtq\" (UID: \"7e0928c1-8104-4803-bf39-f48da5f1fec2\") " pod="openshift-image-registry/node-ca-tnqtq" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.180923 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7e0928c1-8104-4803-bf39-f48da5f1fec2-serviceca\") pod \"node-ca-tnqtq\" (UID: \"7e0928c1-8104-4803-bf39-f48da5f1fec2\") " pod="openshift-image-registry/node-ca-tnqtq" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.224008 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b48z\" (UniqueName: \"kubernetes.io/projected/7e0928c1-8104-4803-bf39-f48da5f1fec2-kube-api-access-8b48z\") pod \"node-ca-tnqtq\" (UID: \"7e0928c1-8104-4803-bf39-f48da5f1fec2\") " pod="openshift-image-registry/node-ca-tnqtq" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.235792 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.283386 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tnqtq" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.311925 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: W1205 00:23:24.318173 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e0928c1_8104_4803_bf39_f48da5f1fec2.slice/crio-fdc6feb33e642c79484b2b1a3bdc3630f1184d2a7ec00399481962199f0a1b0b WatchSource:0}: Error finding container fdc6feb33e642c79484b2b1a3bdc3630f1184d2a7ec00399481962199f0a1b0b: Status 404 returned error can't find the container with id fdc6feb33e642c79484b2b1a3bdc3630f1184d2a7ec00399481962199f0a1b0b Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.339571 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.356858 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e"} Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.356906 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc"} Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.356917 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"e02206444883eab278a17a81f0a6bc964fa92c06a673f4a6d3047c7c75f4fa98"} Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.360820 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.361145 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" 
event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"} Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.361208 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"} Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.361222 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"} Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.361232 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"} Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.361244 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"} Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.361256 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"} Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.363870 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tnqtq" event={"ID":"7e0928c1-8104-4803-bf39-f48da5f1fec2","Type":"ContainerStarted","Data":"fdc6feb33e642c79484b2b1a3bdc3630f1184d2a7ec00399481962199f0a1b0b"} Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.365927 4759 generic.go:334] "Generic (PLEG): container finished" podID="51a56acd-2ed2-498f-bcd1-93cd4ce2a21c" containerID="2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc" exitCode=0 Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.366003 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" event={"ID":"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c","Type":"ContainerDied","Data":"2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc"} Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.392666 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.438236 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.475201 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.514827 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.548083 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.592431 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.628822 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.670182 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.709832 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.746799 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.790515 4759 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.827850 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.869167 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.907180 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.949830 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:24 crc kubenswrapper[4759]: I1205 00:23:24.986759 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:24Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.030352 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc 
kubenswrapper[4759]: I1205 00:23:25.072370 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"Po
dInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.108341 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.147045 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.155196 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.155196 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.155496 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.155437 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.186334 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.227194 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.266897 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.305705 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.371679 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.378359 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" event={"ID":"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c","Type":"ContainerStarted","Data":"371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2"} Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.380413 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tnqtq" event={"ID":"7e0928c1-8104-4803-bf39-f48da5f1fec2","Type":"ContainerStarted","Data":"6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed"} Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.387813 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.427769 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.466322 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.510486 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.548039 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.588168 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.640810 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.670779 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.711435 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.711652 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:23:29.71163213 +0000 UTC m=+28.927293080 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.743369 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.760706 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 
00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.794026 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.813089 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.813150 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.813197 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.813216 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.813332 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.813348 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.813357 4759 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.813400 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:29.813387443 +0000 UTC m=+29.029048383 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.813691 4759 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.813731 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:29.813721941 +0000 UTC m=+29.029382891 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.813777 4759 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.813868 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:29.813855674 +0000 UTC m=+29.029516624 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.813892 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.813926 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.813939 4759 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:25 crc kubenswrapper[4759]: E1205 00:23:25.814008 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:29.813988578 +0000 UTC m=+29.029649608 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.833653 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.872667 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.925940 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.947482 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:25 crc kubenswrapper[4759]: I1205 00:23:25.990712 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:25Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.034855 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev
/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\
\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.155562 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:26 crc kubenswrapper[4759]: E1205 00:23:26.155708 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.384558 4759 generic.go:334] "Generic (PLEG): container finished" podID="51a56acd-2ed2-498f-bcd1-93cd4ce2a21c" containerID="371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2" exitCode=0 Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.384598 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" event={"ID":"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c","Type":"ContainerDied","Data":"371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2"} Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.386066 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd"} Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.408007 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.421579 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.433347 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.447041 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.464684 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.478481 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.490601 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.504355 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.516801 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.535755 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.568074 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z 
is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.580340 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.600878 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.615395 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.628927 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.668377 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.707467 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.792833 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.805860 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.829344 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.866752 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.911971 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.948663 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:26 crc kubenswrapper[4759]: I1205 00:23:26.993965 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:26Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.039898 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z 
is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.070431 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.109671 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.147843 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.154969 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:27 crc kubenswrapper[4759]: E1205 00:23:27.155095 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.154980 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:27 crc kubenswrapper[4759]: E1205 00:23:27.155301 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.391258 4759 generic.go:334] "Generic (PLEG): container finished" podID="51a56acd-2ed2-498f-bcd1-93cd4ce2a21c" containerID="3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c" exitCode=0 Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.391376 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" event={"ID":"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c","Type":"ContainerDied","Data":"3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c"} Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.395387 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"} Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.409503 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.423283 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.434910 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.444970 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2a
ea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.455714 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.466540 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.479257 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.489979 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.506718 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.549009 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.588702 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.633242 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.677040 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z 
is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.712021 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.837805 4759 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.839674 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.839725 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.839738 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.839842 4759 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.845794 4759 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.846065 4759 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.847041 4759 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.847085 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.847101 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.847119 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.847133 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:27Z","lastTransitionTime":"2025-12-05T00:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:27 crc kubenswrapper[4759]: E1205 00:23:27.859268 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.906794 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.906837 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.906848 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.906863 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.906872 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:27Z","lastTransitionTime":"2025-12-05T00:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:27 crc kubenswrapper[4759]: E1205 00:23:27.917661 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.921068 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.921101 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.921110 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.921124 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.921134 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:27Z","lastTransitionTime":"2025-12-05T00:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:27 crc kubenswrapper[4759]: E1205 00:23:27.932617 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.936279 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.936343 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
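Every failed patch in this stretch dies on the same TLS step: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 is serving a certificate whose notAfter is 2025-08-24T17:21:41Z, while the node clock reads 2025-12-05. A minimal Go sketch for confirming that independently of kubelet (illustrative only; the address comes from the log, and InsecureSkipVerify is set deliberately so the handshake gets far enough to expose the certificate that verification rejects):

```go
// certcheck: dial the webhook endpoint from the log and print the
// served leaf certificate's validity window.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Skip verification on purpose: the verification failure is the
	// symptom; we still want to read the certificate being presented.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.UTC().Format(time.RFC3339))
	fmt.Println("expired:  ", now.After(cert.NotAfter))
}
```

On an OpenShift Local/CRC instance this pattern typically appears after the VM has sat stopped past the certificates' lifetime; kubelet cannot recover on its own until the webhook's serving certificate is rotated.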
event="NodeHasNoDiskPressure" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.936355 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.936374 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.936387 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:27Z","lastTransitionTime":"2025-12-05T00:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:27 crc kubenswrapper[4759]: E1205 00:23:27.948196 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.952697 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.952754 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.952772 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.952800 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.952816 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:27Z","lastTransitionTime":"2025-12-05T00:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:27 crc kubenswrapper[4759]: E1205 00:23:27.966379 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:27Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:27 crc kubenswrapper[4759]: E1205 00:23:27.966540 4759 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.968190 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
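The "update node status exceeds retry count" line marks the bound on this loop: kubelet attempts the status patch a fixed number of times per sync pass (nodeStatusUpdateRetry, 5 in upstream kubelet) and then gives up until the next sync interval, which matches the five "will retry" errors logged at 00:23:27 above. A simplified sketch of that control flow, not the actual kubelet source, just the retry shape under those assumptions:

```go
// Sketch of kubelet-style bounded retry for node status updates,
// mirroring the log shape above: N "will retry" errors followed by
// a single "exceeds retry count" error.
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // upstream kubelet's default bound

// tryUpdateNodeStatus stands in for the PATCH that the admission
// webhook rejects; in the log every attempt fails identically.
func tryUpdateNodeStatus() error {
	return errors.New("failed calling webhook: x509: certificate has expired or is not yet valid")
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdateNodeStatus(); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}
```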
event="NodeHasSufficientMemory" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.968218 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.968226 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.968237 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:27 crc kubenswrapper[4759]: I1205 00:23:27.968248 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:27Z","lastTransitionTime":"2025-12-05T00:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.072412 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.072451 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.072462 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.072475 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.072487 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:28Z","lastTransitionTime":"2025-12-05T00:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.154704 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:28 crc kubenswrapper[4759]: E1205 00:23:28.155182 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.174631 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.174666 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.174677 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.174694 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.174706 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:28Z","lastTransitionTime":"2025-12-05T00:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.276926 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.276963 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.276973 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.276988 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.276999 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:28Z","lastTransitionTime":"2025-12-05T00:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.276926 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.276963 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.276973 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.276988 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.276999 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:28Z","lastTransitionTime":"2025-12-05T00:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.381859 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.381894 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.382555 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.382654 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.383241 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:28Z","lastTransitionTime":"2025-12-05T00:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.401264 4759 generic.go:334] "Generic (PLEG): container finished" podID="51a56acd-2ed2-498f-bcd1-93cd4ce2a21c" containerID="91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30" exitCode=0
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.402096 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" event={"ID":"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c","Type":"ContainerDied","Data":"91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30"}
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.417584 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.436078 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.450846 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.468673 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.482218 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.490450 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.490479 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.490488 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.490501 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.490511 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:28Z","lastTransitionTime":"2025-12-05T00:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.496782 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z"
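
The entry just above is the telling one: the kubelet cannot even patch the status of network-node-identity-vrzqb, yet that pod's own webhook container (note the /etc/webhook-cert mount) is what backs the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 that every status patch in this section is failing against. The TLS error itself is a plain validity-window failure: the serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-05T00:23:28Z. A Go sketch that reproduces the same check against a PEM certificate on disk; the file path is a hypothetical stand-in for the webhook's actual serving cert:

```go
// Sketch: reproduce the kubelet's x509 validity complaint for a PEM cert on
// disk. The path below is an assumption for illustration; the check itself,
// NotBefore/NotAfter against the current time, is exactly what the TLS error
// in these entries reports.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Hypothetical path; substitute the webhook's actual serving certificate.
	data, err := os.ReadFile("/etc/webhook-cert/tls.crt")
	if err != nil {
		fmt.Println(err)
		return
	}
	block, _ := pem.Decode(data)
	if block == nil {
		fmt.Println("no PEM block found")
		return
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Println(err)
		return
	}
	now := time.Now().UTC()
	switch {
	case now.Before(cert.NotBefore):
		fmt.Printf("certificate not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
	case now.After(cert.NotAfter):
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	default:
		fmt.Println("certificate valid, expires", cert.NotAfter.UTC().Format(time.RFC3339))
	}
}
```
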
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.522158 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.533808 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.554754 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.573736 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z 
is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.585754 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.593575 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.593604 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.593614 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.593628 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.593639 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:28Z","lastTransitionTime":"2025-12-05T00:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.597609 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.605985 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:28Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.696804 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.696849 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.696861 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.696878 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.696891 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:28Z","lastTransitionTime":"2025-12-05T00:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.799534 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.799574 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.799583 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.799598 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.799607 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:28Z","lastTransitionTime":"2025-12-05T00:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.901963 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.902043 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.902067 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.902095 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:28 crc kubenswrapper[4759]: I1205 00:23:28.902113 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:28Z","lastTransitionTime":"2025-12-05T00:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.004853 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.004894 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.004902 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.004918 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.004931 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:29Z","lastTransitionTime":"2025-12-05T00:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.108127 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.108207 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.108221 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.108240 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.108252 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:29Z","lastTransitionTime":"2025-12-05T00:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.154898 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.155100 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.155399 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.155720 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.210812 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.211037 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.211101 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.211188 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.211244 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:29Z","lastTransitionTime":"2025-12-05T00:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.313977 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.314047 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.314069 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.314096 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.314113 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:29Z","lastTransitionTime":"2025-12-05T00:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.406885 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" event={"ID":"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c","Type":"ContainerStarted","Data":"a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.410184 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.416858 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.416895 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.416905 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.416919 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.416930 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:29Z","lastTransitionTime":"2025-12-05T00:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.538494 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.538535 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.538548 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.538568 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.538580 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:29Z","lastTransitionTime":"2025-12-05T00:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.652160 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.652221 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.652232 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.652254 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.652270 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:29Z","lastTransitionTime":"2025-12-05T00:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.759084 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.759154 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.759167 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.759189 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.759202 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:29Z","lastTransitionTime":"2025-12-05T00:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.777593 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.777752 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:23:37.777731937 +0000 UTC m=+36.993392887 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.867530 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.867562 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.867572 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.867585 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.867594 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:29Z","lastTransitionTime":"2025-12-05T00:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.878398 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.878457 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.878487 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.878520 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878557 4759 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878602 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878630 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878638 4759 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878644 4759 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878655 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:37.878632938 +0000 UTC m=+37.094293888 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878677 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878714 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878728 4759 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878684 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:37.878674139 +0000 UTC m=+37.094335089 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878793 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:37.878778372 +0000 UTC m=+37.094439322 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:29 crc kubenswrapper[4759]: E1205 00:23:29.878813 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:37.878807742 +0000 UTC m=+37.094468692 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.969823 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.969905 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.969917 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.969938 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:29 crc kubenswrapper[4759]: I1205 00:23:29.969951 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:29Z","lastTransitionTime":"2025-12-05T00:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.073392 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.073441 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.073453 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.073474 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.073489 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:30Z","lastTransitionTime":"2025-12-05T00:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.155521 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:30 crc kubenswrapper[4759]: E1205 00:23:30.155653 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.176853 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.176914 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.176930 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.176949 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.176986 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:30Z","lastTransitionTime":"2025-12-05T00:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.279813 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.279868 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.279896 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.279920 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.279937 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:30Z","lastTransitionTime":"2025-12-05T00:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.455460 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.455506 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.455519 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.455538 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.455557 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:30Z","lastTransitionTime":"2025-12-05T00:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.478034 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.495962 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.511420 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.526663 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.543658 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.556903 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.557973 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.558021 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.558032 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.558064 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.558074 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:30Z","lastTransitionTime":"2025-12-05T00:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.569999 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.585132 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.598698 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.613212 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.634750 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.650255 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.661318 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.661368 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.661378 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.661403 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.661415 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:30Z","lastTransitionTime":"2025-12-05T00:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.665615 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.683599 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.702942 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.719204 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.736696 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.748222 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.761913 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.763817 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.763857 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.764145 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.764188 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.764204 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:30Z","lastTransitionTime":"2025-12-05T00:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.776034 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.794224 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9
8100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.823761 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a9
6914bd6cb0ce6e544ade0859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.840531 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.856440 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.867136 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.867180 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.867194 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.867219 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.867233 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:30Z","lastTransitionTime":"2025-12-05T00:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.880125 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.892598 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.906464 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.917608 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:30Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.971465 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.971509 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.971521 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.971537 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:30 crc kubenswrapper[4759]: I1205 00:23:30.971550 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:30Z","lastTransitionTime":"2025-12-05T00:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.074631 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.074690 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.074702 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.074719 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.074731 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:31Z","lastTransitionTime":"2025-12-05T00:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.154763 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.154882 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:31 crc kubenswrapper[4759]: E1205 00:23:31.154930 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:31 crc kubenswrapper[4759]: E1205 00:23:31.155080 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.167551 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.177195 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.177230 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.177242 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.177271 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.177283 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:31Z","lastTransitionTime":"2025-12-05T00:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.177962 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.189205 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.205490 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.226561 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.241086 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: E1205 00:23:31.246444 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a56acd_2ed2_498f_bcd1_93cd4ce2a21c.slice/crio-conmon-a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de.scope\": RecentStats: unable to find data in memory cache]" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.267011 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00
:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.279738 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.279776 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.279787 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.279802 4759 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.279812 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:31Z","lastTransitionTime":"2025-12-05T00:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.302945 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.329044 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.342733 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23
:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.360804 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a9
6914bd6cb0ce6e544ade0859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.373126 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.381672 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.381703 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.381711 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.381724 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.381733 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:31Z","lastTransitionTime":"2025-12-05T00:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.385437 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.395643 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.463164 4759 generic.go:334] "Generic (PLEG): container finished" podID="51a56acd-2ed2-498f-bcd1-93cd4ce2a21c" containerID="a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de" exitCode=0 Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.463222 4759 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" event={"ID":"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c","Type":"ContainerDied","Data":"a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de"} Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.483745 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.483893 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.483972 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.484056 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.484152 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:31Z","lastTransitionTime":"2025-12-05T00:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.486240 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.508761 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.524757 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.536267 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.550092 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.563262 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.576143 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.588150 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.588404 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.588422 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.588442 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.588455 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:31Z","lastTransitionTime":"2025-12-05T00:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.599705 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.614614 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.631277 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.642240 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.654228 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.669019 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.679671 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.691349 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.691415 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.691426 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.691448 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.691464 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:31Z","lastTransitionTime":"2025-12-05T00:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.793994 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.794067 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.794091 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.794123 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.794146 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:31Z","lastTransitionTime":"2025-12-05T00:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.896412 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.896459 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.896482 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.896536 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.896554 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:31Z","lastTransitionTime":"2025-12-05T00:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.998757 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.998794 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.998803 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.998819 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:31 crc kubenswrapper[4759]: I1205 00:23:31.998829 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:31Z","lastTransitionTime":"2025-12-05T00:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.052727 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.064496 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.075692 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.087406 4759 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.099050 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.100878 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.100903 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.100914 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.100927 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.100936 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:32Z","lastTransitionTime":"2025-12-05T00:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.112354 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.124832 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.138412 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.153575 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.154799 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:32 crc kubenswrapper[4759]: E1205 00:23:32.154911 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.169089 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.185810 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.203680 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.203735 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.203758 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.203789 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.203811 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:32Z","lastTransitionTime":"2025-12-05T00:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.209996 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.227827 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.242192 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.252906 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.306681 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.306720 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.306730 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.306746 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.306757 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:32Z","lastTransitionTime":"2025-12-05T00:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.410563 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.410631 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.410660 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.410688 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.410705 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:32Z","lastTransitionTime":"2025-12-05T00:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.473501 4759 generic.go:334] "Generic (PLEG): container finished" podID="51a56acd-2ed2-498f-bcd1-93cd4ce2a21c" containerID="295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c" exitCode=0 Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.473610 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" event={"ID":"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c","Type":"ContainerDied","Data":"295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c"} Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.491917 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.512355 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.513207 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.513246 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.513255 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.513269 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.513279 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:32Z","lastTransitionTime":"2025-12-05T00:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.527717 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.542581 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.554849 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.566004 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.589460 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.600581 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.610522 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.619538 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.619587 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.619599 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.619617 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.619659 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:32Z","lastTransitionTime":"2025-12-05T00:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.626127 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e8389
0acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.644904 4759 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.658849 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.676622 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.687595 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.721600 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.721636 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.721646 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.721660 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.721671 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:32Z","lastTransitionTime":"2025-12-05T00:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.824075 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.824110 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.824120 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.824136 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.824148 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:32Z","lastTransitionTime":"2025-12-05T00:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.926987 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.927031 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.927047 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.927064 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:32 crc kubenswrapper[4759]: I1205 00:23:32.927076 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:32Z","lastTransitionTime":"2025-12-05T00:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.029245 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.029618 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.029631 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.029649 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.029661 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:33Z","lastTransitionTime":"2025-12-05T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.131553 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.131589 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.131597 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.131610 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.131619 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:33Z","lastTransitionTime":"2025-12-05T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.155148 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.155239 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:33 crc kubenswrapper[4759]: E1205 00:23:33.155283 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:33 crc kubenswrapper[4759]: E1205 00:23:33.155386 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.164686 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.164723 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.164842 4759 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.179581 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.188151 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.196783 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.211633 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.223278 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.233743 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.233789 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.233802 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:33 crc 
kubenswrapper[4759]: I1205 00:23:33.233823 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.233837 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:33Z","lastTransitionTime":"2025-12-05T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.236970 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.253236 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.276808 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.292798 4759 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f21254
9f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.304814 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.320293 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.330689 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.337766 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.337813 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.337830 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.337848 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.337860 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:33Z","lastTransitionTime":"2025-12-05T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.348132 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.366321 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.387653 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.409067 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96
914bd6cb0ce6e544ade0859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.425012 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.440405 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.440442 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.440452 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.440465 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.440473 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:33Z","lastTransitionTime":"2025-12-05T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.441060 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.454228 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.469503 4759 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f21254
9f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.482682 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.483835 4759 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.484021 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" event={"ID":"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c","Type":"ContainerStarted","Data":"cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c"} Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.497482 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.508741 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.521743 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.533857 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.542762 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.542803 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.542815 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.542832 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.542845 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:33Z","lastTransitionTime":"2025-12-05T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.550888 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e8389
0acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.574974 4759 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:17
4f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.591657 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.607864 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.620971 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.636979 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.645235 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.645282 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.645294 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.645333 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.645349 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:33Z","lastTransitionTime":"2025-12-05T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.647774 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.660670 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.677136 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.688510 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.699231 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.711104 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.724238 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.737914 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.748239 4759 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.748275 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.748283 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.748297 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.748341 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:33Z","lastTransitionTime":"2025-12-05T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.750982 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.765292 4759 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.779483 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.791341 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.804125 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:33Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.853384 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.853426 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.853436 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.853451 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.853462 4759 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:33Z","lastTransitionTime":"2025-12-05T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.956575 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.956630 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.956643 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.956661 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:33 crc kubenswrapper[4759]: I1205 00:23:33.956673 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:33Z","lastTransitionTime":"2025-12-05T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.059077 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.059123 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.059137 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.059156 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.059168 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:34Z","lastTransitionTime":"2025-12-05T00:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.155266 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:34 crc kubenswrapper[4759]: E1205 00:23:34.155402 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.161106 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.161168 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.161183 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.161201 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.161213 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:34Z","lastTransitionTime":"2025-12-05T00:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.264540 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.264604 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.264621 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.264646 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.264668 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:34Z","lastTransitionTime":"2025-12-05T00:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.367438 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.367499 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.367508 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.367522 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.367531 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:34Z","lastTransitionTime":"2025-12-05T00:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.469465 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.469532 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.469544 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.469584 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.469598 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:34Z","lastTransitionTime":"2025-12-05T00:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.489537 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/0.log" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.492550 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859" exitCode=1 Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.492603 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.493818 4759 scope.go:117] "RemoveContainer" containerID="93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.517369 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.542364 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.554074 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.564441 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.572128 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.572179 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.572193 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.572211 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.572224 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:34Z","lastTransitionTime":"2025-12-05T00:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.575757 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.588230 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.602399 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.619292 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a9
6914bd6cb0ce6e544ade0859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"message\\\":\\\"ing reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 00:23:33.404062 5959 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 00:23:33.404080 5959 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 00:23:33.404105 5959 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 00:23:33.404116 5959 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 00:23:33.404431 5959 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 00:23:33.404455 5959 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 00:23:33.404486 5959 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 00:23:33.404493 5959 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 00:23:33.404512 5959 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 00:23:33.404519 5959 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 00:23:33.404528 5959 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 00:23:33.404536 5959 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 00:23:33.404546 5959 factory.go:656] Stopping watch factory\\\\nI1205 00:23:33.404552 5959 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 00:23:33.404564 5959 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.632220 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.642578 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.651552 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.665128 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.675493 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.675533 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.675544 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.675559 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.675568 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:34Z","lastTransitionTime":"2025-12-05T00:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.686835 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.702072 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.777632 4759 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.777666 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.777677 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.777689 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.777698 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:34Z","lastTransitionTime":"2025-12-05T00:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.880195 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.880238 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.880246 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.880260 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.880270 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:34Z","lastTransitionTime":"2025-12-05T00:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.891571 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n"] Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.891960 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.894228 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.894379 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.905272 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.919170 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.929314 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2gj\" (UniqueName: \"kubernetes.io/projected/8b4f1c6b-7070-450f-8187-881125eec0d4-kube-api-access-vh2gj\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.929458 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b4f1c6b-7070-450f-8187-881125eec0d4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.929515 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b4f1c6b-7070-450f-8187-881125eec0d4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.929554 4759 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b4f1c6b-7070-450f-8187-881125eec0d4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.931854 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.945063 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.961492 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a9
6914bd6cb0ce6e544ade0859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"message\\\":\\\"ing reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 00:23:33.404062 5959 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 00:23:33.404080 5959 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 00:23:33.404105 5959 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 00:23:33.404116 5959 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 00:23:33.404431 5959 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 00:23:33.404455 5959 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 00:23:33.404486 5959 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 00:23:33.404493 5959 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 00:23:33.404512 5959 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 00:23:33.404519 5959 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 00:23:33.404528 5959 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 00:23:33.404536 5959 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 00:23:33.404546 5959 factory.go:656] Stopping watch factory\\\\nI1205 00:23:33.404552 5959 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 00:23:33.404564 5959 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.972263 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.982916 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.982950 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.982959 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:34 crc 
kubenswrapper[4759]: I1205 00:23:34.982973 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.982984 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:34Z","lastTransitionTime":"2025-12-05T00:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.986541 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"nam
e\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:34 crc kubenswrapper[4759]: I1205 00:23:34.996193 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:34Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.006704 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.016301 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.028460 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.030908 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b4f1c6b-7070-450f-8187-881125eec0d4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.030959 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b4f1c6b-7070-450f-8187-881125eec0d4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.030997 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b4f1c6b-7070-450f-8187-881125eec0d4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.031023 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2gj\" (UniqueName: \"kubernetes.io/projected/8b4f1c6b-7070-450f-8187-881125eec0d4-kube-api-access-vh2gj\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.031653 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b4f1c6b-7070-450f-8187-881125eec0d4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.031810 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b4f1c6b-7070-450f-8187-881125eec0d4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.039692 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b4f1c6b-7070-450f-8187-881125eec0d4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.040597 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.046767 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2gj\" (UniqueName: \"kubernetes.io/projected/8b4f1c6b-7070-450f-8187-881125eec0d4-kube-api-access-vh2gj\") pod \"ovnkube-control-plane-749d76644c-spg8n\" (UID: \"8b4f1c6b-7070-450f-8187-881125eec0d4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.052528 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.063875 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.073065 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.085695 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.085737 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.085751 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.085774 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.085786 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:35Z","lastTransitionTime":"2025-12-05T00:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.155641 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:35 crc kubenswrapper[4759]: E1205 00:23:35.155786 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.156188 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:35 crc kubenswrapper[4759]: E1205 00:23:35.156264 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.188636 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.188695 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.188717 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.188744 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.188763 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:35Z","lastTransitionTime":"2025-12-05T00:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.206774 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" Dec 05 00:23:35 crc kubenswrapper[4759]: W1205 00:23:35.226124 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b4f1c6b_7070_450f_8187_881125eec0d4.slice/crio-cb772396287e6130ee79d6b5cfeab4b53564ea3e1fed07cf1c97babf9e14373b WatchSource:0}: Error finding container cb772396287e6130ee79d6b5cfeab4b53564ea3e1fed07cf1c97babf9e14373b: Status 404 returned error can't find the container with id cb772396287e6130ee79d6b5cfeab4b53564ea3e1fed07cf1c97babf9e14373b Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.300811 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.300844 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.300853 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.300868 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.300880 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:35Z","lastTransitionTime":"2025-12-05T00:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.403225 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.403256 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.403268 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.403283 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.403293 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:35Z","lastTransitionTime":"2025-12-05T00:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.496679 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" event={"ID":"8b4f1c6b-7070-450f-8187-881125eec0d4","Type":"ContainerStarted","Data":"cb772396287e6130ee79d6b5cfeab4b53564ea3e1fed07cf1c97babf9e14373b"} Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.498443 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/0.log" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.500890 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1"} Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.500996 4759 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.505478 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.505515 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.505528 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.505541 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.505551 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:35Z","lastTransitionTime":"2025-12-05T00:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.514516 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.529718 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.541085 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.554922 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.566886 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.581797 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.600327 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"message\\\":\\\"ing reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 00:23:33.404062 5959 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 00:23:33.404080 5959 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 00:23:33.404105 5959 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 00:23:33.404116 5959 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 00:23:33.404431 5959 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 00:23:33.404455 5959 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 00:23:33.404486 5959 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 00:23:33.404493 5959 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 00:23:33.404512 5959 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 00:23:33.404519 5959 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 00:23:33.404528 5959 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 00:23:33.404536 5959 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 00:23:33.404546 5959 factory.go:656] Stopping watch factory\\\\nI1205 00:23:33.404552 5959 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 00:23:33.404564 5959 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.610748 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.611035 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.611064 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.611073 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.611088 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.611097 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:35Z","lastTransitionTime":"2025-12-05T00:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.622745 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.635864 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.647961 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.658740 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.668062 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2a
ea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.676256 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.686000 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:35Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.713791 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.713824 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.713835 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.713851 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.713863 4759 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:35Z","lastTransitionTime":"2025-12-05T00:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.816417 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.816451 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.816462 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.816478 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.816490 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:35Z","lastTransitionTime":"2025-12-05T00:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.919665 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.919718 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.919735 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.919757 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:35 crc kubenswrapper[4759]: I1205 00:23:35.919775 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:35Z","lastTransitionTime":"2025-12-05T00:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.022951 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.022985 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.022993 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.023007 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.023016 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:36Z","lastTransitionTime":"2025-12-05T00:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.124937 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.124974 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.124982 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.124996 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.125005 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:36Z","lastTransitionTime":"2025-12-05T00:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.155661 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:23:36 crc kubenswrapper[4759]: E1205 00:23:36.155832 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.228427 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.228473 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.228484 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.228501 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.228512 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:36Z","lastTransitionTime":"2025-12-05T00:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.370441 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.370478 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.370487 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.370500 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.370510 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:36Z","lastTransitionTime":"2025-12-05T00:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.472028 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.472076 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.472087 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.472102 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.472112 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:36Z","lastTransitionTime":"2025-12-05T00:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.505965 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" event={"ID":"8b4f1c6b-7070-450f-8187-881125eec0d4","Type":"ContainerStarted","Data":"e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4"}
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.506001 4759 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.506013 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" event={"ID":"8b4f1c6b-7070-450f-8187-881125eec0d4","Type":"ContainerStarted","Data":"ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6"}
Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.520374 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.532017 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.545707 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.564152 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8
868d1b903a41d0ac2acccbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"message\\\":\\\"ing reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 00:23:33.404062 5959 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 00:23:33.404080 5959 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 00:23:33.404105 5959 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 00:23:33.404116 5959 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 00:23:33.404431 5959 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 00:23:33.404455 5959 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 00:23:33.404486 5959 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 00:23:33.404493 5959 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 00:23:33.404512 5959 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 00:23:33.404519 5959 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 00:23:33.404528 5959 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 00:23:33.404536 5959 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 00:23:33.404546 5959 factory.go:656] Stopping watch factory\\\\nI1205 00:23:33.404552 5959 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 00:23:33.404564 5959 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.574293 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.574331 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.574354 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.574370 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.574381 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:36Z","lastTransitionTime":"2025-12-05T00:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.577903 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.589860 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.627270 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.639931 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.651105 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.663448 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.674465 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.675955 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.675985 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.675993 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.676006 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.676018 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:36Z","lastTransitionTime":"2025-12-05T00:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.691818 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.704110 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.746159 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.752324 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ksxg9"] Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.753015 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:36 crc kubenswrapper[4759]: E1205 00:23:36.753123 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.769520 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.772177 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rfgr\" (UniqueName: \"kubernetes.io/projected/f6ca2f36-241c-41cb-9d1d-d6856e819953-kube-api-access-5rfgr\") pod \"network-metrics-daemon-ksxg9\" (UID: 
\"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.772287 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.778604 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.778658 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.778671 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.778691 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.778703 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:36Z","lastTransitionTime":"2025-12-05T00:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.783135 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.795671 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.805623 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.816361 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.826812 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.837841 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.849484 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6
355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.860557 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.873300 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.873569 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.873614 4759 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rfgr\" (UniqueName: \"kubernetes.io/projected/f6ca2f36-241c-41cb-9d1d-d6856e819953-kube-api-access-5rfgr\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:36 crc kubenswrapper[4759]: E1205 00:23:36.873734 4759 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:23:36 crc kubenswrapper[4759]: E1205 00:23:36.873800 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs podName:f6ca2f36-241c-41cb-9d1d-d6856e819953 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:37.373782469 +0000 UTC m=+36.589443419 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs") pod "network-metrics-daemon-ksxg9" (UID: "f6ca2f36-241c-41cb-9d1d-d6856e819953") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.880908 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.880946 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.880956 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.880971 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.880981 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:36Z","lastTransitionTime":"2025-12-05T00:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
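The failed metrics-certs mount above is queued for retry with durationBeforeRetry 500ms; when the same operation fails again at 00:23:37 (further down), the delay has grown to 1s, so the per-operation retry delay doubles after each consecutive failure. A minimal Go sketch of that doubling pattern, using the 500ms starting delay taken from these entries (the cap is an illustrative assumption, not kubelet's actual constant):

    package main

    import (
        "fmt"
        "time"
    )

    // nextBackoff doubles the retry delay after each consecutive failure,
    // up to an assumed cap. The 500ms starting point matches the
    // "durationBeforeRetry 500ms" entry above; the 1s retry at 00:23:37
    // is one doubling later.
    func nextBackoff(d time.Duration) time.Duration {
        const maxDelay = 2 * time.Minute // illustrative cap, not a kubelet constant
        d *= 2
        if d > maxDelay {
            d = maxDelay
        }
        return d
    }

    func main() {
        delay := 500 * time.Millisecond // from "durationBeforeRetry 500ms"
        for attempt := 1; attempt <= 4; attempt++ {
            fmt.Printf("attempt %d: retry after %v\n", attempt, delay)
            delay = nextBackoff(delay)
        }
    }

The same shape of back-off, with a larger base, appears later in this log in the CrashLoopBackOff entry ("back-off 10s restarting failed container ... ovnkube-controller").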
Has your network provider started?"} Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.884116 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.894986 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rfgr\" (UniqueName: \"kubernetes.io/projected/f6ca2f36-241c-41cb-9d1d-d6856e819953-kube-api-access-5rfgr\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.896439 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.907960 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.921285 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.940248 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8
868d1b903a41d0ac2acccbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"message\\\":\\\"ing reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 00:23:33.404062 5959 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 00:23:33.404080 5959 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 00:23:33.404105 5959 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 00:23:33.404116 5959 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 00:23:33.404431 5959 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 00:23:33.404455 5959 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 00:23:33.404486 5959 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 00:23:33.404493 5959 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 00:23:33.404512 5959 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 00:23:33.404519 5959 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 00:23:33.404528 5959 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 00:23:33.404536 5959 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 00:23:33.404546 5959 factory.go:656] Stopping watch factory\\\\nI1205 00:23:33.404552 5959 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 00:23:33.404564 5959 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.952830 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 
00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.963944 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:36Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.983580 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.983646 4759 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.983658 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.983672 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:36 crc kubenswrapper[4759]: I1205 00:23:36.983681 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:36Z","lastTransitionTime":"2025-12-05T00:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.085607 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.085657 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.085667 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.085683 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.085693 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:37Z","lastTransitionTime":"2025-12-05T00:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.155412 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.155417 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.155617 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.155719 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.188515 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.188587 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.188604 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.188629 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.188650 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:37Z","lastTransitionTime":"2025-12-05T00:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.291180 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.291238 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.291254 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.291277 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.291293 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:37Z","lastTransitionTime":"2025-12-05T00:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.378195 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.378405 4759 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.378495 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs podName:f6ca2f36-241c-41cb-9d1d-d6856e819953 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:38.378474131 +0000 UTC m=+37.594135111 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs") pod "network-metrics-daemon-ksxg9" (UID: "f6ca2f36-241c-41cb-9d1d-d6856e819953") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.394136 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.394209 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.394223 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.394238 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.394272 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:37Z","lastTransitionTime":"2025-12-05T00:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.497055 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.497096 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.497104 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.497117 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.497128 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:37Z","lastTransitionTime":"2025-12-05T00:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.510997 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/1.log" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.511757 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/0.log" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.515121 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1" exitCode=1 Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.515167 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1"} Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.515259 4759 scope.go:117] "RemoveContainer" containerID="93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.515922 4759 scope.go:117] "RemoveContainer" containerID="61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1" Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.516044 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.536635 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.561155 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"message\\\":\\\"ing reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 00:23:33.404062 5959 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 00:23:33.404080 5959 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 00:23:33.404105 5959 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 00:23:33.404116 5959 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 00:23:33.404431 5959 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 00:23:33.404455 5959 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 00:23:33.404486 5959 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 00:23:33.404493 5959 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 00:23:33.404512 5959 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 00:23:33.404519 5959 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 00:23:33.404528 5959 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 00:23:33.404536 5959 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 00:23:33.404546 5959 factory.go:656] Stopping watch factory\\\\nI1205 00:23:33.404552 5959 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 00:23:33.404564 5959 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 00:23:36.662410 6167 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1205 00:23:36.662441 6167 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1205 00:23:36.662460 6167 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1205 00:23:36.662505 6167 factory.go:1336] Added *v1.Node event handler 7\\\\nI1205 00:23:36.662540 6167 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1205 00:23:36.662833 6167 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 00:23:36.662931 6167 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 00:23:36.662972 6167 ovnkube.go:599] Stopped ovnkube\\\\nI1205 00:23:36.662998 6167 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 00:23:36.663068 6167 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.578597 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 
00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.592729 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.600272 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.600374 4759 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.600392 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.600416 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.600432 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:37Z","lastTransitionTime":"2025-12-05T00:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.607571 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.620128 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.633519 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.649535 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.664214 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.677992 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.696484 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.702621 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.702708 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.702733 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.702757 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.702776 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:37Z","lastTransitionTime":"2025-12-05T00:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.711513 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.727275 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.738799 
4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.753068 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.766169 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:37Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.784560 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.784707 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:23:53.784687891 +0000 UTC m=+53.000348851 (durationBeforeRetry 16s). 
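
Annotation: the "No retries permitted until ... (durationBeforeRetry 16s)" lines show the kubelet's per-volume exponential backoff; 16s is consistent with a delay that doubles on every consecutive failure of the same operation. The sketch below reconstructs such a schedule for illustration only; the 500ms base, factor of 2, and two-minute cap are assumptions, not values read out of this log or the kubelet source.

package main

import (
	"fmt"
	"time"
)

func main() {
	// Hypothetical doubling backoff: 0.5s, 1s, 2s, 4s, 8s, 16s, ... capped.
	delay, maxDelay := 500*time.Millisecond, 2*time.Minute
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("failure %d: next retry in %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

Under these assumptions the 16s wait seen above would correspond to the sixth consecutive failure of the same volume operation.
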
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.804817 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.804847 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.804857 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.804876 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.804892 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:37Z","lastTransitionTime":"2025-12-05T00:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.885832 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.885881 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.885901 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.885925 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886024 4759 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886067 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:53.886055223 +0000 UTC m=+53.101716173 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886447 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886467 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886470 4759 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886567 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886480 4759 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886581 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:53.886553376 +0000 UTC m=+53.102214316 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886664 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:53.886656789 +0000 UTC m=+53.102317739 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886619 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886686 4759 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:37 crc kubenswrapper[4759]: E1205 00:23:37.886715 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:53.88670942 +0000 UTC m=+53.102370370 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.907668 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.907750 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.907794 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.907822 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:37 crc kubenswrapper[4759]: I1205 00:23:37.907840 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:37Z","lastTransitionTime":"2025-12-05T00:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
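
Annotation: the recurring NodeNotReady condition is the other half of the same deadlock: the kubelet reports NetworkReady=false because nothing has written a CNI config yet, and the network provider (OVN-Kubernetes, judging by the openshift-network-* namespaces above) cannot come up while its own pods and webhook calls are failing. A trivial Go check for the directory named in the message:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path from the kubelet message above
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI config dir:", err)
		return
	}
	found := false
	for _, e := range entries {
		ext := filepath.Ext(e.Name())
		if ext == ".conf" || ext == ".conflist" || ext == ".json" {
			fmt.Println("CNI config present:", e.Name())
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration files; kubelet will report NetworkReady=false")
	}
}
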
Has your network provider started?"} Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.003978 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.004099 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.004121 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.004148 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.004168 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:38Z","lastTransitionTime":"2025-12-05T00:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:38 crc kubenswrapper[4759]: E1205 00:23:38.023850 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:38Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.028526 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.028562 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.028572 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.028588 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.028600 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:38Z","lastTransitionTime":"2025-12-05T00:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:38 crc kubenswrapper[4759]: E1205 00:23:38.041391 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:38Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.044553 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.044596 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.044607 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.044623 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.044635 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:38Z","lastTransitionTime":"2025-12-05T00:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:38 crc kubenswrapper[4759]: E1205 00:23:38.055822 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:38Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.060538 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.060572 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.060583 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.060600 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.060612 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:38Z","lastTransitionTime":"2025-12-05T00:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:38 crc kubenswrapper[4759]: E1205 00:23:38.076025 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:38Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.079204 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.079270 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
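Every retry above dies at the same webhook endpoint, so the first thing worth confirming on the node is that endpoint's serving certificate. A minimal diagnostic sketch, not part of the log, assuming Python 3 with the pyca/cryptography package is available on the host; the host and port are taken from the Post URL in the error:

# check_webhook_cert.py -- diagnostic sketch (assumes Python 3 + pyca/cryptography).
# Fetches the webhook serving cert WITHOUT verification -- verification is
# exactly what fails above -- and prints its validity window.
import datetime
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # from Post "https://127.0.0.1:9743/node?timeout=10s"

# ssl.get_server_certificate does not validate the chain, so it still
# returns an expired certificate.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

now = datetime.datetime.utcnow()
print("subject:    ", cert.subject.rfc4514_string())
print("not before: ", cert.not_valid_before)
print("not after:  ", cert.not_valid_after)
print("expired now:", now > cert.not_valid_after)
# Expected here: not after 2025-08-24 17:21:41, i.e. expired months before 2025-12-05.
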
event="NodeHasNoDiskPressure" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.079296 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.079371 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.079399 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:38Z","lastTransitionTime":"2025-12-05T00:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:38 crc kubenswrapper[4759]: E1205 00:23:38.103044 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:38Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:38 crc kubenswrapper[4759]: E1205 00:23:38.103191 4759 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.104693 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
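For reference, the err= payload logged on every one of those attempts is a single strategic merge patch for the Node object: the $setElementOrder/conditions directive pins the ordering of the four conditions, and the images list is what makes each retry log several kilobytes. A sketch of the same shape in Python, with values copied from the entries above and the image list elided:

# status_patch_shape.py -- reconstructs the shape of the kubelet's node-status
# patch seen in the err= field above; illustrative, image list elided.
import json

patch = {
    "status": {
        # Strategic-merge directive: pin the order of the condition entries.
        "$setElementOrder/conditions": [
            {"type": "MemoryPressure"},
            {"type": "DiskPressure"},
            {"type": "PIDPressure"},
            {"type": "Ready"},
        ],
        "allocatable": {"cpu": "11800m", "ephemeral-storage": "76396645454", "memory": "32404560Ki"},
        "capacity": {"cpu": "12", "ephemeral-storage": "83293888Ki", "memory": "32865360Ki"},
        "conditions": [
            {
                "type": "Ready",
                "status": "False",
                "reason": "KubeletNotReady",
                "message": "container runtime network not ready: NetworkReady=false ...",
            },
            # MemoryPressure / DiskPressure / PIDPressure entries elided
        ],
        # "images": [ dozens of {"names": [...], "sizeBytes": ...} entries --
        # the bulk of every logged retry ]
    }
}

print(json.dumps(patch, indent=2))
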
event="NodeHasSufficientMemory" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.104757 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.104778 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.104803 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.104825 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:38Z","lastTransitionTime":"2025-12-05T00:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.155801 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.155862 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:38 crc kubenswrapper[4759]: E1205 00:23:38.155985 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:38 crc kubenswrapper[4759]: E1205 00:23:38.156156 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.207188 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.207230 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.207244 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.207264 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.207280 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:38Z","lastTransitionTime":"2025-12-05T00:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.309741 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.310114 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.310282 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.310476 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.310503 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:38Z","lastTransitionTime":"2025-12-05T00:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:38 crc kubenswrapper[4759]: I1205 00:23:38.389959 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:38 crc kubenswrapper[4759]: E1205 00:23:38.390128 4759 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:23:38 crc kubenswrapper[4759]: E1205 00:23:38.390188 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs podName:f6ca2f36-241c-41cb-9d1d-d6856e819953 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:40.390173122 +0000 UTC m=+39.605834082 (durationBeforeRetry 2s). 
[the four "Recording event message for node" entries (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) and the accompanying "Node became not ready" condition repeat at roughly 100 ms intervals from 00:23:38.413085 through 00:23:39.134889 with identical content; elided]
Has your network provider started?"} Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.155110 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.155202 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:39 crc kubenswrapper[4759]: E1205 00:23:39.155296 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:39 crc kubenswrapper[4759]: E1205 00:23:39.155833 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.237509 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.237578 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.237590 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.237612 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.237627 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:39Z","lastTransitionTime":"2025-12-05T00:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.340735 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.341118 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.341251 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.341437 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.341620 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:39Z","lastTransitionTime":"2025-12-05T00:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.445539 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.445630 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.445657 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.445688 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.445711 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:39Z","lastTransitionTime":"2025-12-05T00:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.548661 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.548700 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.548711 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.548725 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.548736 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:39Z","lastTransitionTime":"2025-12-05T00:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.650726 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.650773 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.650786 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.650803 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.650815 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:39Z","lastTransitionTime":"2025-12-05T00:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.753031 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.753086 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.753097 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.753117 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.753132 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:39Z","lastTransitionTime":"2025-12-05T00:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.856417 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.856515 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.856536 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.856570 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.856591 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:39Z","lastTransitionTime":"2025-12-05T00:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.959106 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.959146 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.959156 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.959173 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:39 crc kubenswrapper[4759]: I1205 00:23:39.959186 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:39Z","lastTransitionTime":"2025-12-05T00:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.062006 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.062036 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.062045 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.062059 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.062071 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:40Z","lastTransitionTime":"2025-12-05T00:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.155685 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.155691 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:40 crc kubenswrapper[4759]: E1205 00:23:40.156057 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:40 crc kubenswrapper[4759]: E1205 00:23:40.156168 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.165754 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.165828 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.165848 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.165876 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.165897 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:40Z","lastTransitionTime":"2025-12-05T00:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.271233 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.271276 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.271286 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.271326 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.271339 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:40Z","lastTransitionTime":"2025-12-05T00:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.373849 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.373900 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.373917 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.373935 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.373947 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:40Z","lastTransitionTime":"2025-12-05T00:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.408679 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:40 crc kubenswrapper[4759]: E1205 00:23:40.408848 4759 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:23:40 crc kubenswrapper[4759]: E1205 00:23:40.408904 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs podName:f6ca2f36-241c-41cb-9d1d-d6856e819953 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:44.408888509 +0000 UTC m=+43.624549449 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs") pod "network-metrics-daemon-ksxg9" (UID: "f6ca2f36-241c-41cb-9d1d-d6856e819953") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.476118 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.476167 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.476175 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.476189 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.476200 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:40Z","lastTransitionTime":"2025-12-05T00:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.528840 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/1.log" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.578927 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.578986 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.579003 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.579043 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.579061 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:40Z","lastTransitionTime":"2025-12-05T00:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.681616 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.681677 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.681762 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.681791 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.681810 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:40Z","lastTransitionTime":"2025-12-05T00:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.794096 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.794125 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.794134 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.794147 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.794156 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:40Z","lastTransitionTime":"2025-12-05T00:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.896468 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.896510 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.896519 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.896533 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.896543 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:40Z","lastTransitionTime":"2025-12-05T00:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.998484 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.998538 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.998549 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.998568 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:40 crc kubenswrapper[4759]: I1205 00:23:40.998580 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:40Z","lastTransitionTime":"2025-12-05T00:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.100393 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.100457 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.100471 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.100494 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.100543 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:41Z","lastTransitionTime":"2025-12-05T00:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.154872 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.154977 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:41 crc kubenswrapper[4759]: E1205 00:23:41.155040 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:41 crc kubenswrapper[4759]: E1205 00:23:41.155139 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.171432 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.184466 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.197882 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.211186 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.211239 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.211251 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.211286 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.211299 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:41Z","lastTransitionTime":"2025-12-05T00:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.219117 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.238225 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.264826 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93991fa4d82aaa435a26c7f22b384b88219280a96914bd6cb0ce6e544ade0859\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"message\\\":\\\"ing reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 00:23:33.404062 5959 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 00:23:33.404080 5959 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 00:23:33.404105 5959 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 00:23:33.404116 5959 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 00:23:33.404431 5959 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 00:23:33.404455 5959 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1205 00:23:33.404486 5959 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 00:23:33.404493 5959 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 00:23:33.404512 5959 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 00:23:33.404519 5959 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 00:23:33.404528 5959 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 00:23:33.404536 5959 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1205 00:23:33.404546 5959 factory.go:656] Stopping watch factory\\\\nI1205 00:23:33.404552 5959 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 00:23:33.404564 5959 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 00:23:36.662410 6167 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1205 00:23:36.662441 6167 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1205 00:23:36.662460 6167 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1205 00:23:36.662505 6167 factory.go:1336] Added *v1.Node event handler 7\\\\nI1205 00:23:36.662540 6167 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1205 00:23:36.662833 6167 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 00:23:36.662931 6167 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 00:23:36.662972 6167 ovnkube.go:599] Stopped ovnkube\\\\nI1205 00:23:36.662998 6167 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 00:23:36.663068 6167 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.279290 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.296954 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.308446 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.313637 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.313671 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.313683 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.313698 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.313709 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:41Z","lastTransitionTime":"2025-12-05T00:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.322907 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.337423 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.351750 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.365083 4759 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f21254
9f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.377931 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.391338 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.404031 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:41Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.416146 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.416198 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.416212 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.416230 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.416241 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:41Z","lastTransitionTime":"2025-12-05T00:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.519059 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.519111 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.519122 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.519145 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.519159 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:41Z","lastTransitionTime":"2025-12-05T00:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.622505 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.622573 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.622585 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.622603 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.622616 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:41Z","lastTransitionTime":"2025-12-05T00:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.725490 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.725651 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.725671 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.725690 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.725705 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:41Z","lastTransitionTime":"2025-12-05T00:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.828319 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.828381 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.828392 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.828408 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.828418 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:41Z","lastTransitionTime":"2025-12-05T00:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.930881 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.930944 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.930962 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.930984 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:41 crc kubenswrapper[4759]: I1205 00:23:41.931001 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:41Z","lastTransitionTime":"2025-12-05T00:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.033452 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.033492 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.033502 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.033517 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.033529 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:42Z","lastTransitionTime":"2025-12-05T00:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.137171 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.137201 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.137221 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.137246 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.137266 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:42Z","lastTransitionTime":"2025-12-05T00:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.155595 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.155678 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:42 crc kubenswrapper[4759]: E1205 00:23:42.155743 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:42 crc kubenswrapper[4759]: E1205 00:23:42.155830 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.239484 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.239513 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.239521 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.239534 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.239542 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:42Z","lastTransitionTime":"2025-12-05T00:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.341837 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.342076 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.342146 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.342217 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.342284 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:42Z","lastTransitionTime":"2025-12-05T00:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.444284 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.444642 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.445088 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.445175 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.445238 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:42Z","lastTransitionTime":"2025-12-05T00:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.547378 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.547417 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.547428 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.547443 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.547451 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:42Z","lastTransitionTime":"2025-12-05T00:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.649274 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.649753 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.649815 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.649910 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.649972 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:42Z","lastTransitionTime":"2025-12-05T00:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.752090 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.752130 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.752139 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.752162 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.752173 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:42Z","lastTransitionTime":"2025-12-05T00:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.854285 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.854369 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.854383 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.854399 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.854411 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:42Z","lastTransitionTime":"2025-12-05T00:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.956789 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.956831 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.956840 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.956856 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:42 crc kubenswrapper[4759]: I1205 00:23:42.956865 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:42Z","lastTransitionTime":"2025-12-05T00:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.059639 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.059696 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.059706 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.059722 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.059733 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:43Z","lastTransitionTime":"2025-12-05T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.155410 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.155461 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:43 crc kubenswrapper[4759]: E1205 00:23:43.155554 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:43 crc kubenswrapper[4759]: E1205 00:23:43.155698 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.167482 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.167538 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.167555 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.167576 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.167591 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:43Z","lastTransitionTime":"2025-12-05T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.269742 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.269842 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.269850 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.269864 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.269872 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:43Z","lastTransitionTime":"2025-12-05T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.543715 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.543743 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.543750 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.543766 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.543774 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:43Z","lastTransitionTime":"2025-12-05T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.645958 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.646000 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.646008 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.646022 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.646031 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:43Z","lastTransitionTime":"2025-12-05T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.749167 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.749198 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.749207 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.749220 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.749230 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:43Z","lastTransitionTime":"2025-12-05T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.851890 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.851929 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.851937 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.851952 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.851962 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:43Z","lastTransitionTime":"2025-12-05T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.954517 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.954571 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.954586 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.954605 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:43 crc kubenswrapper[4759]: I1205 00:23:43.954617 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:43Z","lastTransitionTime":"2025-12-05T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.056511 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.056549 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.056560 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.056574 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.056585 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:44Z","lastTransitionTime":"2025-12-05T00:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.155072 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.155156 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:44 crc kubenswrapper[4759]: E1205 00:23:44.155210 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:44 crc kubenswrapper[4759]: E1205 00:23:44.155296 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.158722 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.158758 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.158770 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.158795 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.158807 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:44Z","lastTransitionTime":"2025-12-05T00:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.261777 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.261819 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.261833 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.261854 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.261869 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:44Z","lastTransitionTime":"2025-12-05T00:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.363848 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.363898 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.363915 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.363936 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.363953 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:44Z","lastTransitionTime":"2025-12-05T00:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.452241 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:44 crc kubenswrapper[4759]: E1205 00:23:44.452570 4759 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:23:44 crc kubenswrapper[4759]: E1205 00:23:44.452704 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs podName:f6ca2f36-241c-41cb-9d1d-d6856e819953 nodeName:}" failed. No retries permitted until 2025-12-05 00:23:52.452673848 +0000 UTC m=+51.668334838 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs") pod "network-metrics-daemon-ksxg9" (UID: "f6ca2f36-241c-41cb-9d1d-d6856e819953") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.466826 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.466862 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.466873 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.466888 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.466898 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:44Z","lastTransitionTime":"2025-12-05T00:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.569739 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.569802 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.569821 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.569846 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.569867 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:44Z","lastTransitionTime":"2025-12-05T00:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.672789 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.672858 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.672880 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.672911 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.672934 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:44Z","lastTransitionTime":"2025-12-05T00:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.775365 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.775426 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.775438 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.775454 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.775464 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:44Z","lastTransitionTime":"2025-12-05T00:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.881520 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.881590 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.881603 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.881629 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.881645 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:44Z","lastTransitionTime":"2025-12-05T00:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.984101 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.984155 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.984167 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.984185 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:44 crc kubenswrapper[4759]: I1205 00:23:44.984196 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:44Z","lastTransitionTime":"2025-12-05T00:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.086897 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.086997 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.087013 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.087066 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.087079 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:45Z","lastTransitionTime":"2025-12-05T00:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.155194 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.155207 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:45 crc kubenswrapper[4759]: E1205 00:23:45.155413 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:45 crc kubenswrapper[4759]: E1205 00:23:45.155482 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.191063 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.191119 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.191128 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.191143 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.191153 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:45Z","lastTransitionTime":"2025-12-05T00:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.293959 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.294011 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.294022 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.294039 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.294051 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:45Z","lastTransitionTime":"2025-12-05T00:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.397624 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.397668 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.397683 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.397705 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.397716 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:45Z","lastTransitionTime":"2025-12-05T00:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.501434 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.501470 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.501482 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.501498 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.501511 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:45Z","lastTransitionTime":"2025-12-05T00:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.603983 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.604023 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.604034 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.604049 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.604060 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:45Z","lastTransitionTime":"2025-12-05T00:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.705870 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.705928 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.705943 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.705960 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.705970 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:45Z","lastTransitionTime":"2025-12-05T00:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.807646 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.807711 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.807724 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.807739 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.807747 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:45Z","lastTransitionTime":"2025-12-05T00:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.909892 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.909967 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.909984 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.910008 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:45 crc kubenswrapper[4759]: I1205 00:23:45.910026 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:45Z","lastTransitionTime":"2025-12-05T00:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.012950 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.013003 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.013016 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.013041 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.013058 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:46Z","lastTransitionTime":"2025-12-05T00:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.115250 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.115326 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.115343 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.115362 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.115384 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:46Z","lastTransitionTime":"2025-12-05T00:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.154873 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.154937 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:46 crc kubenswrapper[4759]: E1205 00:23:46.155043 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:46 crc kubenswrapper[4759]: E1205 00:23:46.155336 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.217552 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.217614 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.217650 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.217677 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.217697 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:46Z","lastTransitionTime":"2025-12-05T00:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.321089 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.321190 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.321208 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.321231 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.321247 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:46Z","lastTransitionTime":"2025-12-05T00:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.423432 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.423485 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.423498 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.423517 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.423528 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:46Z","lastTransitionTime":"2025-12-05T00:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.526656 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.526711 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.526721 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.526739 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.526751 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:46Z","lastTransitionTime":"2025-12-05T00:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.628344 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.628396 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.628410 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.628425 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.628436 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:46Z","lastTransitionTime":"2025-12-05T00:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.730894 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.730938 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.730947 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.730961 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.730970 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:46Z","lastTransitionTime":"2025-12-05T00:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.834395 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.834466 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.834477 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.834498 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.834511 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:46Z","lastTransitionTime":"2025-12-05T00:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.938281 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.938412 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.938455 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.938485 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:46 crc kubenswrapper[4759]: I1205 00:23:46.938508 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:46Z","lastTransitionTime":"2025-12-05T00:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.041600 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.041653 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.041674 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.041701 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.041722 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:47Z","lastTransitionTime":"2025-12-05T00:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.143839 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.143868 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.143877 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.143893 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.143912 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:47Z","lastTransitionTime":"2025-12-05T00:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.155436 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:47 crc kubenswrapper[4759]: E1205 00:23:47.155540 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.155633 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:47 crc kubenswrapper[4759]: E1205 00:23:47.155791 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.246548 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.246623 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.246639 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.246657 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.246669 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:47Z","lastTransitionTime":"2025-12-05T00:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.349644 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.349685 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.349694 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.349709 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.349721 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:47Z","lastTransitionTime":"2025-12-05T00:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.453145 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.453193 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.453203 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.453217 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.453227 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:47Z","lastTransitionTime":"2025-12-05T00:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.556392 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.556445 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.556456 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.556473 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.556482 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:47Z","lastTransitionTime":"2025-12-05T00:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.659782 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.659836 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.659851 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.659877 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.659889 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:47Z","lastTransitionTime":"2025-12-05T00:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.761960 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.762012 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.762024 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.762042 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.762054 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:47Z","lastTransitionTime":"2025-12-05T00:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.865634 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.865676 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.865686 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.865702 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.865713 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:47Z","lastTransitionTime":"2025-12-05T00:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.968364 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.968476 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.968500 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.968532 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:47 crc kubenswrapper[4759]: I1205 00:23:47.968555 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:47Z","lastTransitionTime":"2025-12-05T00:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.011486 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.012900 4759 scope.go:117] "RemoveContainer" containerID="61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.030231 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.045286 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.056459 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.071108 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 
00:23:48.071147 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.071158 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.071178 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.071131 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.071190 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.082551 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.096828 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.128551 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 00:23:36.662410 6167 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1205 00:23:36.662441 6167 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1205 00:23:36.662460 6167 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1205 00:23:36.662505 6167 factory.go:1336] Added *v1.Node event handler 7\\\\nI1205 00:23:36.662540 6167 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1205 00:23:36.662833 6167 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 00:23:36.662931 6167 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 00:23:36.662972 6167 ovnkube.go:599] Stopped ovnkube\\\\nI1205 00:23:36.662998 6167 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 00:23:36.663068 6167 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.142429 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.154273 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.155645 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.155732 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:48 crc kubenswrapper[4759]: E1205 00:23:48.155867 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:48 crc kubenswrapper[4759]: E1205 00:23:48.155930 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.170586 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.174695 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.174737 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.174746 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.174760 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.174771 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.183217 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.194802 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.208032 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.218796 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2a
ea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.229277 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.240714 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.277906 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.277970 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.277982 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.278000 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.278011 4759 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.376886 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.376908 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.376915 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.376928 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.376937 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: E1205 00:23:48.394281 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.398520 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.398567 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.398576 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.398589 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.398598 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: E1205 00:23:48.409435 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.412721 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.412748 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.412758 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.412776 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.412787 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: E1205 00:23:48.423418 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.427011 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.427073 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.427087 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.427123 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.427135 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: E1205 00:23:48.441508 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.445543 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.445582 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.445595 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.445612 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.445623 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: E1205 00:23:48.458503 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: E1205 00:23:48.459271 4759 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.460709 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
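The block above is one full pass of the kubelet's node-status sync loop: every PATCH of node "crc" is rejected because the validating webhook node.network-node-identity.openshift.io at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, and after a fixed number of attempts (five in the upstream kubelet, the nodeStatusUpdateRetry constant) the kubelet gives up with "update node status exceeds retry count". A minimal sketch of how the expiry could be confirmed from the node itself, assuming Python plus the third-party cryptography package is available; the address is taken verbatim from the log, and the snippet is illustrative rather than part of any OpenShift tooling:

import datetime
import ssl

from cryptography import x509

# Handshake without verification and grab the peer certificate as PEM;
# verification has to be skipped because the chain is already invalid.
pem = ssl.get_server_certificate(("127.0.0.1", 9743))
cert = x509.load_pem_x509_certificate(pem.encode("ascii"))

not_after = cert.not_valid_after          # naive UTC datetime
now = datetime.datetime.utcnow()
print(f"notAfter={not_after:%Y-%m-%dT%H:%M:%SZ} now={now:%Y-%m-%dT%H:%M:%SZ}")
if now > not_after:
    # Matches the log: 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z,
    # so TLS verification fails before the webhook request is even sent.
    print("webhook serving certificate has expired")

Until that certificate is rotated, every write that must pass this webhook (the node status patches above, the pod status patches below) fails identically, so the repetition in the log carries no information beyond the timestamps.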
event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.460740 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.460751 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.460766 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.460777 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.562250 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.562285 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.562294 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.562322 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.562331 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.563099 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/1.log" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.566233 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2"} Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.566675 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.581849 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.598818 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.614024 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.637821 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 00:23:36.662410 6167 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1205 00:23:36.662441 6167 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1205 00:23:36.662460 6167 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1205 00:23:36.662505 6167 factory.go:1336] Added *v1.Node event handler 7\\\\nI1205 00:23:36.662540 6167 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1205 00:23:36.662833 6167 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 00:23:36.662931 6167 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 00:23:36.662972 6167 ovnkube.go:599] Stopped ovnkube\\\\nI1205 00:23:36.662998 6167 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 00:23:36.663068 6167 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.657497 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 
00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.665053 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.665093 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.665102 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.665116 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.665126 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.673254 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.687826 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.700993 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.714116 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.726855 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.738386 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.748335 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.761093 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6
355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.767431 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.767464 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.767478 4759 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.767493 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.767503 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.774367 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.785187 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.794198 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:48Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.869098 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 
00:23:48.869136 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.869145 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.869158 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.869167 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.971801 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.971841 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.971850 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.971865 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:48 crc kubenswrapper[4759]: I1205 00:23:48.971873 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:48Z","lastTransitionTime":"2025-12-05T00:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.074686 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.074722 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.074732 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.074748 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.074758 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:49Z","lastTransitionTime":"2025-12-05T00:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.154775 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.154900 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:49 crc kubenswrapper[4759]: E1205 00:23:49.154901 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:49 crc kubenswrapper[4759]: E1205 00:23:49.155111 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.177765 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.177812 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.177822 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.177840 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.177851 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:49Z","lastTransitionTime":"2025-12-05T00:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.280855 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.280912 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.280928 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.280952 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.280968 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:49Z","lastTransitionTime":"2025-12-05T00:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.383414 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.383466 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.383481 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.383501 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.383519 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:49Z","lastTransitionTime":"2025-12-05T00:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.487237 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.487342 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.487369 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.487398 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.487420 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:49Z","lastTransitionTime":"2025-12-05T00:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.589932 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.589978 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.589988 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.589999 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.590007 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:49Z","lastTransitionTime":"2025-12-05T00:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.693036 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.693090 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.693107 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.693129 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.693146 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:49Z","lastTransitionTime":"2025-12-05T00:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.797148 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.797208 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.797220 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.797238 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.797252 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:49Z","lastTransitionTime":"2025-12-05T00:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.901176 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.901234 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.901246 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.901263 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:49 crc kubenswrapper[4759]: I1205 00:23:49.901287 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:49Z","lastTransitionTime":"2025-12-05T00:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.004373 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.004419 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.004442 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.004461 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.004475 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:50Z","lastTransitionTime":"2025-12-05T00:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.106622 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.106679 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.106701 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.106724 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.106742 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:50Z","lastTransitionTime":"2025-12-05T00:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.154673 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.154722 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:50 crc kubenswrapper[4759]: E1205 00:23:50.154804 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:50 crc kubenswrapper[4759]: E1205 00:23:50.154903 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.209962 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.210029 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.210048 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.210075 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.210095 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:50Z","lastTransitionTime":"2025-12-05T00:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.312735 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.312796 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.312815 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.312835 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.312851 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:50Z","lastTransitionTime":"2025-12-05T00:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.415384 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.415438 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.415454 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.415478 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.415500 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:50Z","lastTransitionTime":"2025-12-05T00:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.525487 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.525555 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.525573 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.525598 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.525615 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:50Z","lastTransitionTime":"2025-12-05T00:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.627769 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.627806 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.627823 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.627839 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.627848 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:50Z","lastTransitionTime":"2025-12-05T00:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.730408 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.730454 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.730464 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.730481 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.730493 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:50Z","lastTransitionTime":"2025-12-05T00:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.832583 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.832620 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.832629 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.832643 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.832651 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:50Z","lastTransitionTime":"2025-12-05T00:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.935642 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.935670 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.935678 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.935691 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:50 crc kubenswrapper[4759]: I1205 00:23:50.935702 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:50Z","lastTransitionTime":"2025-12-05T00:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.039412 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.039529 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.039555 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.039587 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.039608 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:51Z","lastTransitionTime":"2025-12-05T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.142529 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.142594 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.142611 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.142754 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.142790 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:51Z","lastTransitionTime":"2025-12-05T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.155402 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.155535 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:51 crc kubenswrapper[4759]: E1205 00:23:51.156533 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:51 crc kubenswrapper[4759]: E1205 00:23:51.156663 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.175848 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.192906 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.208682 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.223289 4759 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f21254
9f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.241856 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.245607 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.245662 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.245681 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.245703 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.245719 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:51Z","lastTransitionTime":"2025-12-05T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.259846 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.271793 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.285666 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 
00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.298733 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.324450 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.342105 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.348153 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.348189 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.348198 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.348212 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.348224 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:51Z","lastTransitionTime":"2025-12-05T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.361198 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.385403 4759 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 00:23:36.662410 6167 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1205 00:23:36.662441 6167 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1205 00:23:36.662460 6167 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1205 00:23:36.662505 6167 factory.go:1336] Added *v1.Node event handler 7\\\\nI1205 00:23:36.662540 6167 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1205 00:23:36.662833 6167 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 00:23:36.662931 6167 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 00:23:36.662972 6167 ovnkube.go:599] Stopped ovnkube\\\\nI1205 00:23:36.662998 6167 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 00:23:36.663068 6167 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.405790 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.420360 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.432989 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.452088 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.452162 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.452176 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.452195 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.452548 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:51Z","lastTransitionTime":"2025-12-05T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.554928 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.555001 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.555024 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.555096 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.555117 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:51Z","lastTransitionTime":"2025-12-05T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.583159 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/2.log" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.584133 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/1.log" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.587271 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2" exitCode=1 Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.587399 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2"} Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.587457 4759 scope.go:117] "RemoveContainer" containerID="61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.589186 4759 scope.go:117] "RemoveContainer" containerID="ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2" Dec 05 00:23:51 crc kubenswrapper[4759]: E1205 00:23:51.589556 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.600814 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.613253 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.626093 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.637571 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.647829 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.657611 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.657638 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.657646 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.657659 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.657668 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:51Z","lastTransitionTime":"2025-12-05T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.658102 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.670128 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.678737 
4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.695050 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.715169 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.747801 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.760241 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.760291 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:51 crc 
kubenswrapper[4759]: I1205 00:23:51.760302 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.760352 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.760364 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:51Z","lastTransitionTime":"2025-12-05T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.773038 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c2
1db86d2da7e26a85d002d5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 00:23:36.662410 6167 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1205 00:23:36.662441 6167 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1205 00:23:36.662460 6167 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1205 00:23:36.662505 6167 factory.go:1336] Added *v1.Node event handler 7\\\\nI1205 00:23:36.662540 6167 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1205 00:23:36.662833 6167 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 00:23:36.662931 6167 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 00:23:36.662972 6167 ovnkube.go:599] Stopped ovnkube\\\\nI1205 00:23:36.662998 6167 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 00:23:36.663068 6167 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:50Z\\\",\\\"message\\\":\\\"ller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 00:23:49.103226 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\
\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.786734 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 
00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.797204 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.811804 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.825827 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.863209 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.863245 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.863256 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.863277 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.863289 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:51Z","lastTransitionTime":"2025-12-05T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.966747 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.966811 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.966829 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.966858 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:51 crc kubenswrapper[4759]: I1205 00:23:51.966880 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:51Z","lastTransitionTime":"2025-12-05T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
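
From here the journal settles into a steady loop: the node is held NotReady because the container runtime reports NetworkReady=false with "no CNI configuration file in /etc/kubernetes/cni/net.d/", i.e. ovn-kubernetes has not yet written its CNI config (its ovnkube-controller container is crash-looping, per the terminated states earlier). A small sketch that reproduces the input to that readiness check, assuming only the directory path quoted in the message; a hypothetical helper, not kubelet code:

    // cni_check.go — list /etc/kubernetes/cni/net.d and report whether any
    // CNI configuration file (.conf/.conflist/.json) is present yet.
    package main

    import (
        "fmt"
        "os"
        "strings"
    )

    func main() {
        const dir = "/etc/kubernetes/cni/net.d" // path taken from the log message
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read CNI conf dir:", err)
            return
        }
        found := false
        for _, e := range entries {
            name := e.Name()
            if strings.HasSuffix(name, ".conf") ||
                strings.HasSuffix(name, ".conflist") ||
                strings.HasSuffix(name, ".json") {
                fmt.Println("found CNI config:", name)
                found = true
            }
        }
        if !found {
            fmt.Println("no CNI configuration file in", dir)
        }
    }

Once the OVN-Kubernetes node pod recovers and writes its config file there (the exact filename varies by plugin), the NetworkReady condition flips and the NodeNotReady loop below stops.
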
Has your network provider started?"} Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.070114 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.070256 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.070283 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.070345 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.070367 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:52Z","lastTransitionTime":"2025-12-05T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.155256 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.155263 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:52 crc kubenswrapper[4759]: E1205 00:23:52.155533 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:52 crc kubenswrapper[4759]: E1205 00:23:52.155641 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.173920 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.173992 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.174009 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.174028 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.174041 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:52Z","lastTransitionTime":"2025-12-05T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.276478 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.276531 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.276539 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.276554 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.276565 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:52Z","lastTransitionTime":"2025-12-05T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.378823 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.378873 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.378881 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.378894 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.378904 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:52Z","lastTransitionTime":"2025-12-05T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.481711 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.481762 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.481773 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.481790 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.481801 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:52Z","lastTransitionTime":"2025-12-05T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.548085 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:52 crc kubenswrapper[4759]: E1205 00:23:52.548213 4759 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:23:52 crc kubenswrapper[4759]: E1205 00:23:52.548274 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs podName:f6ca2f36-241c-41cb-9d1d-d6856e819953 nodeName:}" failed. No retries permitted until 2025-12-05 00:24:08.548258223 +0000 UTC m=+67.763919183 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs") pod "network-metrics-daemon-ksxg9" (UID: "f6ca2f36-241c-41cb-9d1d-d6856e819953") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.585043 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.585104 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.585118 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.585134 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.585145 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:52Z","lastTransitionTime":"2025-12-05T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.591849 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/2.log" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.687287 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.687353 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.687362 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.687377 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.687390 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:52Z","lastTransitionTime":"2025-12-05T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
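
Note the retry scheduling in the mount failure above: the metrics-certs volume cannot be set up because the kubelet's watch-based object cache has not (re)registered openshift-multus/metrics-daemon-secret yet after the restart, so the operation is parked with durationBeforeRetry 16s; later records in this journal park similar volume operations for 32s. The delay doubles per failure up to a cap, a standard exponential backoff. A generic sketch of that pattern, under the assumption of a 500ms base and a 2-minute cap; kubelet's actual implementation (nestedpendingoperations) differs in detail:

    // backoff.go — doubling retry delay with a cap, the shape behind the
    // 16s -> 32s durationBeforeRetry values in these records.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 500 * time.Millisecond  // assumed starting delay
        const maxDelay = 2 * time.Minute // assumed cap
        for attempt := 1; attempt <= 10; attempt++ {
            fmt.Printf("attempt %d: retry in %s\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
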
Has your network provider started?"} Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.790843 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.790951 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.790972 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.791095 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.791192 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:52Z","lastTransitionTime":"2025-12-05T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.893234 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.893372 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.893406 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.893437 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.893460 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:52Z","lastTransitionTime":"2025-12-05T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.996287 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.996405 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.996426 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.996450 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:52 crc kubenswrapper[4759]: I1205 00:23:52.996468 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:52Z","lastTransitionTime":"2025-12-05T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.099420 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.099484 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.099502 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.099526 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.099554 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:53Z","lastTransitionTime":"2025-12-05T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.155428 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.155466 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.155605 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.155739 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.201816 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.201855 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.201874 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.201890 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.201902 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:53Z","lastTransitionTime":"2025-12-05T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.304402 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.304435 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.304443 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.304457 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.304466 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:53Z","lastTransitionTime":"2025-12-05T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.407226 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.407284 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.407328 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.407352 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.407370 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:53Z","lastTransitionTime":"2025-12-05T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.510697 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.510767 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.510790 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.510819 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.510842 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:53Z","lastTransitionTime":"2025-12-05T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.613819 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.613869 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.613886 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.613908 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.613925 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:53Z","lastTransitionTime":"2025-12-05T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.716130 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.716181 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.716189 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.716203 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.716211 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:53Z","lastTransitionTime":"2025-12-05T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.819835 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.819874 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.819885 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.819904 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.819916 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:53Z","lastTransitionTime":"2025-12-05T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.862300 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.862482 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:24:25.862461393 +0000 UTC m=+85.078122343 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.922285 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.922335 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.922347 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.922362 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.922371 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:53Z","lastTransitionTime":"2025-12-05T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.963448 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.963501 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.963534 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:53 crc kubenswrapper[4759]: I1205 00:23:53.963585 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.963625 4759 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.963707 4759 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.963758 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:24:25.963723193 +0000 UTC m=+85.179384183 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.963763 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.963821 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.963822 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:24:25.963791555 +0000 UTC m=+85.179452555 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.963843 4759 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.963955 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 00:24:25.963934208 +0000 UTC m=+85.179595308 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.963757 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.964017 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.964027 4759 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:53 crc kubenswrapper[4759]: E1205 00:23:53.964061 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 00:24:25.964048272 +0000 UTC m=+85.179709222 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.003154 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.018134 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.025099 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.025185 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.025198 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.025214 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.025224 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:54Z","lastTransitionTime":"2025-12-05T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.025414 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.044206 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.056157 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.070685 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.084611 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.098374 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.113356 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6
355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.125577 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.127168 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.127196 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.127206 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.127224 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.127235 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:54Z","lastTransitionTime":"2025-12-05T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.138751 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.151103 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.154796 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.154822 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:54 crc kubenswrapper[4759]: E1205 00:23:54.154907 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:54 crc kubenswrapper[4759]: E1205 00:23:54.154967 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.164299 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.176906 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.193887 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.206399 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.230037 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.230083 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.230095 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.230114 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.230150 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:54Z","lastTransitionTime":"2025-12-05T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.231188 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.252620 4759 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 00:23:36.662410 6167 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1205 00:23:36.662441 6167 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1205 00:23:36.662460 6167 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1205 00:23:36.662505 6167 factory.go:1336] Added *v1.Node event handler 7\\\\nI1205 00:23:36.662540 6167 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1205 00:23:36.662833 6167 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 00:23:36.662931 6167 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 00:23:36.662972 6167 ovnkube.go:599] Stopped ovnkube\\\\nI1205 00:23:36.662998 6167 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 00:23:36.663068 6167 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:50Z\\\",\\\"message\\\":\\\"ller.go:473] Services do not 
match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 00:23:49.103226 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.332505 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.332559 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.332571 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.332586 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.332596 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:54Z","lastTransitionTime":"2025-12-05T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.435739 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.435803 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.435812 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.435830 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.435843 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:54Z","lastTransitionTime":"2025-12-05T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.538215 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.538250 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.538262 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.538277 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.538288 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:54Z","lastTransitionTime":"2025-12-05T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.640740 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.640771 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.640780 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.640795 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.640805 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:54Z","lastTransitionTime":"2025-12-05T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.743255 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.743604 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.743671 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.743739 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.743794 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:54Z","lastTransitionTime":"2025-12-05T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.846682 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.847080 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.847282 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.847563 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.847754 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:54Z","lastTransitionTime":"2025-12-05T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.950602 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.950647 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.950670 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.950695 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:54 crc kubenswrapper[4759]: I1205 00:23:54.950711 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:54Z","lastTransitionTime":"2025-12-05T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.053775 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.054723 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.054972 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.055174 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.055438 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:55Z","lastTransitionTime":"2025-12-05T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.154946 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.155018 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:55 crc kubenswrapper[4759]: E1205 00:23:55.155658 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:55 crc kubenswrapper[4759]: E1205 00:23:55.155864 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.158520 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.158559 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.158572 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.158589 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.158600 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:55Z","lastTransitionTime":"2025-12-05T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.261180 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.261219 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.261229 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.261245 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.261257 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:55Z","lastTransitionTime":"2025-12-05T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.364871 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.364976 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.364996 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.365021 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.365038 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:55Z","lastTransitionTime":"2025-12-05T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.468586 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.468653 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.468674 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.468701 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.468724 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:55Z","lastTransitionTime":"2025-12-05T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.571688 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.571735 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.571749 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.571768 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.571783 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:55Z","lastTransitionTime":"2025-12-05T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.674339 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.674388 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.674403 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.674422 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.674434 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:55Z","lastTransitionTime":"2025-12-05T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.776979 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.777281 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.777417 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.777531 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.777694 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:55Z","lastTransitionTime":"2025-12-05T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.880378 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.880690 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.880803 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.880906 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.881003 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:55Z","lastTransitionTime":"2025-12-05T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.984482 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.984900 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.985123 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.985390 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:55 crc kubenswrapper[4759]: I1205 00:23:55.985623 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:55Z","lastTransitionTime":"2025-12-05T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.088947 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.089042 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.089079 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.089120 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.089159 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:56Z","lastTransitionTime":"2025-12-05T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.155753 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:56 crc kubenswrapper[4759]: E1205 00:23:56.156120 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.155770 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:56 crc kubenswrapper[4759]: E1205 00:23:56.156340 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.191939 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.191974 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.191982 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.191994 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.192003 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:56Z","lastTransitionTime":"2025-12-05T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.294205 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.294272 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.294288 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.294341 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.294356 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:56Z","lastTransitionTime":"2025-12-05T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.397911 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.397964 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.397990 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.398012 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.398027 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:56Z","lastTransitionTime":"2025-12-05T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.499954 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.499997 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.500011 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.500029 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.500042 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:56Z","lastTransitionTime":"2025-12-05T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.603121 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.603632 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.603730 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.603826 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.603993 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:56Z","lastTransitionTime":"2025-12-05T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.706836 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.706913 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.706949 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.706981 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.707004 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:56Z","lastTransitionTime":"2025-12-05T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.809996 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.810058 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.810073 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.810089 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.810099 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:56Z","lastTransitionTime":"2025-12-05T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.913022 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.913067 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.913076 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.913090 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:56 crc kubenswrapper[4759]: I1205 00:23:56.913099 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:56Z","lastTransitionTime":"2025-12-05T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.015225 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.015277 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.015347 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.015390 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.015410 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:57Z","lastTransitionTime":"2025-12-05T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.118768 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.118818 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.118834 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.118858 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.118875 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:57Z","lastTransitionTime":"2025-12-05T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.155676 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.155710 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:57 crc kubenswrapper[4759]: E1205 00:23:57.155840 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:57 crc kubenswrapper[4759]: E1205 00:23:57.156015 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.220982 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.221033 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.221044 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.221060 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.221071 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:57Z","lastTransitionTime":"2025-12-05T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.323248 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.323294 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.323326 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.323343 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.323354 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:57Z","lastTransitionTime":"2025-12-05T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.425513 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.425589 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.425614 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.425638 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.425655 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:57Z","lastTransitionTime":"2025-12-05T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.528356 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.528426 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.528444 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.528471 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.528493 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:57Z","lastTransitionTime":"2025-12-05T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.634685 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.634756 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.634768 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.634789 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.634800 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:57Z","lastTransitionTime":"2025-12-05T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.737493 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.737550 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.737563 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.737585 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.737600 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:57Z","lastTransitionTime":"2025-12-05T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.841382 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.841426 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.841437 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.841464 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.841476 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:57Z","lastTransitionTime":"2025-12-05T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.944214 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.944301 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.944331 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.944350 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:57 crc kubenswrapper[4759]: I1205 00:23:57.944362 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:57Z","lastTransitionTime":"2025-12-05T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.047096 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.047145 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.047155 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.047173 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.047191 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.149716 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.149764 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.149775 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.149793 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.149805 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.155393 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:23:58 crc kubenswrapper[4759]: E1205 00:23:58.155528 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.155394 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:23:58 crc kubenswrapper[4759]: E1205 00:23:58.155619 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.252892 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.252933 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.252958 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.252971 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.252980 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.356092 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.356175 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.356184 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.356201 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.356211 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.458540 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.458577 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.458587 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.458610 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.458620 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.505781 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.505834 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.505843 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.505858 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.505871 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: E1205 00:23:58.526008 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.535526 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.535621 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.535630 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.535657 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.535667 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: E1205 00:23:58.554057 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.559025 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.559064 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.559077 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.559095 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.559108 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: E1205 00:23:58.572088 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.576835 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.576861 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.576870 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.576882 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.576892 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: E1205 00:23:58.593373 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.596752 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.596783 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.596795 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.596807 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.596816 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: E1205 00:23:58.607769 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 05 00:23:58 crc kubenswrapper[4759]: E1205 00:23:58.607931 4759 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.609429 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.609457 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.609468 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.609485 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.609497 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.711925 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.711994 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.712019 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.712054 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.712077 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.815552 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.815612 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.815632 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.815667 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.815684 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.919263 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.919374 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.919393 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.919416 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:58 crc kubenswrapper[4759]: I1205 00:23:58.919433 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:58Z","lastTransitionTime":"2025-12-05T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.022765 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.022826 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.022837 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.022854 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.022867 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:59Z","lastTransitionTime":"2025-12-05T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.126017 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.126078 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.126095 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.126120 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.126138 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:59Z","lastTransitionTime":"2025-12-05T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.155403 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.155545 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:23:59 crc kubenswrapper[4759]: E1205 00:23:59.155695 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:23:59 crc kubenswrapper[4759]: E1205 00:23:59.155846 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.228715 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.228773 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.228784 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.228803 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.228815 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:59Z","lastTransitionTime":"2025-12-05T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.331888 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.331967 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.331992 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.332023 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.332045 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:59Z","lastTransitionTime":"2025-12-05T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.435661 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.435712 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.435725 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.435742 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.435754 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:59Z","lastTransitionTime":"2025-12-05T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.539125 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.539194 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.539218 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.539250 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.539271 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:59Z","lastTransitionTime":"2025-12-05T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.642073 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.642129 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.642150 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.642178 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.642196 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:59Z","lastTransitionTime":"2025-12-05T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.744544 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.744579 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.744587 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.744599 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.744611 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:59Z","lastTransitionTime":"2025-12-05T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.847480 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.847548 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.847566 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.847594 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.847619 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:59Z","lastTransitionTime":"2025-12-05T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.950880 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.950948 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.950970 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.951000 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:23:59 crc kubenswrapper[4759]: I1205 00:23:59.951023 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:23:59Z","lastTransitionTime":"2025-12-05T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.053711 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.053759 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.053780 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.053807 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.053835 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:00Z","lastTransitionTime":"2025-12-05T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.154887 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.154911 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:00 crc kubenswrapper[4759]: E1205 00:24:00.155070 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:00 crc kubenswrapper[4759]: E1205 00:24:00.155234 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.158510 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.158577 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.158607 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.158776 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.158795 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:00Z","lastTransitionTime":"2025-12-05T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.264104 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.264187 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.264212 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.264244 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.264268 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:00Z","lastTransitionTime":"2025-12-05T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.368058 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.368103 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.368113 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.368127 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.368137 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:00Z","lastTransitionTime":"2025-12-05T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.469995 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.470034 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.470046 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.470065 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.470076 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:00Z","lastTransitionTime":"2025-12-05T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.572044 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.572109 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.572126 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.572161 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.572195 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:00Z","lastTransitionTime":"2025-12-05T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.674381 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.674451 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.674472 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.674499 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.674520 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:00Z","lastTransitionTime":"2025-12-05T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.777467 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.777529 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.777543 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.777567 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.777582 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:00Z","lastTransitionTime":"2025-12-05T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.880183 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.880255 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.880274 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.880301 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.880350 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:00Z","lastTransitionTime":"2025-12-05T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.982745 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.982783 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.982794 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.982809 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:00 crc kubenswrapper[4759]: I1205 00:24:00.982817 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:00Z","lastTransitionTime":"2025-12-05T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.085522 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.085573 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.085585 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.085603 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.085613 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:01Z","lastTransitionTime":"2025-12-05T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.155661 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.155728 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:01 crc kubenswrapper[4759]: E1205 00:24:01.155807 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:01 crc kubenswrapper[4759]: E1205 00:24:01.155895 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.170321 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0
bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.182138 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.187688 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.187727 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.187736 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.187751 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.187760 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:01Z","lastTransitionTime":"2025-12-05T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.192482 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.205828 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.220539 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.235978 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.248619 4759 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.261978 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.273138 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06493027-6840-495a-b874-24cd666119e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db87ee4414d125d3b2c7793912651ef8da8b812b7cfe36a032bc5ac1bdc9ba84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ff5c2d198ddc477a01bd417061044b131461ef5cffdce4c56290296fdbb8140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c52ab59e908cca2e0d8e2cc5b808fb09cc893deb1fa77d516369f36f8bc6dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.284680 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.290201 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.290238 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.290248 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.290263 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.290274 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:01Z","lastTransitionTime":"2025-12-05T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.297913 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.320177 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61980678e41ca850d0d01ef42dfdf2c66b5a94d8868d1b903a41d0ac2acccbc1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 00:23:36.662410 6167 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1205 00:23:36.662441 6167 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1205 00:23:36.662460 6167 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1205 00:23:36.662505 6167 factory.go:1336] Added *v1.Node event handler 7\\\\nI1205 00:23:36.662540 6167 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1205 00:23:36.662833 6167 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1205 00:23:36.662931 6167 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1205 00:23:36.662972 6167 ovnkube.go:599] Stopped ovnkube\\\\nI1205 00:23:36.662998 6167 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1205 00:23:36.663068 6167 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:50Z\\\",\\\"message\\\":\\\"ller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 00:23:49.103226 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.335966 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 
00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.350025 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.363530 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.376000 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.391957 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.391999 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.392011 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.392030 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.392042 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:01Z","lastTransitionTime":"2025-12-05T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.393723 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.495279 4759 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.495436 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.495495 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.495519 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.495537 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:01Z","lastTransitionTime":"2025-12-05T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.598669 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.598977 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.599085 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.599206 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.599334 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:01Z","lastTransitionTime":"2025-12-05T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.701724 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.701768 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.701799 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.701817 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.701827 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:01Z","lastTransitionTime":"2025-12-05T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.805062 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.805104 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.805113 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.805142 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.805151 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:01Z","lastTransitionTime":"2025-12-05T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.909371 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.909443 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.909469 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.909499 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:01 crc kubenswrapper[4759]: I1205 00:24:01.909521 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:01Z","lastTransitionTime":"2025-12-05T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.011911 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.011955 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.011971 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.011989 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.012002 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:02Z","lastTransitionTime":"2025-12-05T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.115276 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.115383 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.115406 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.115431 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.115448 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:02Z","lastTransitionTime":"2025-12-05T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.155014 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.155093 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:02 crc kubenswrapper[4759]: E1205 00:24:02.155217 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:02 crc kubenswrapper[4759]: E1205 00:24:02.155271 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.156480 4759 scope.go:117] "RemoveContainer" containerID="ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2" Dec 05 00:24:02 crc kubenswrapper[4759]: E1205 00:24:02.156958 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.172367 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.190229 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.199569 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.211754 4759 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f21254
9f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.217784 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.217815 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.217826 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.217838 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.217847 4759 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:02Z","lastTransitionTime":"2025-12-05T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.220935 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06493027-6840-495a-b874-24cd666119e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db87ee4414d125d3b2c7793912651ef8da8b812b7cfe36a032bc5ac1bdc9ba84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ff5c2d198ddc477a01bd417061044b131461ef5cffdce4c56290296fdbb8140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c52ab59e908cca2e0d8e2cc5b808fb09cc893deb1fa77d516369f36f8bc6dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.233818 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.244478 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.255537 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.269218 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.280675 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.291866 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.305926 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.320233 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.320280 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.320291 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.320325 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.320339 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:02Z","lastTransitionTime":"2025-12-05T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.324005 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:50Z\\\",\\\"message\\\":\\\"ller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 00:23:49.103226 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.337033 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 
00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.349998 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.363231 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.372750 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.422598 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.422641 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.422650 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.422662 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.422670 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:02Z","lastTransitionTime":"2025-12-05T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.525777 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.525838 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.525857 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.525878 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.525895 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:02Z","lastTransitionTime":"2025-12-05T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.628656 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.628711 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.628725 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.628746 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.628760 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:02Z","lastTransitionTime":"2025-12-05T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.731460 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.731548 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.731564 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.731589 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.731610 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:02Z","lastTransitionTime":"2025-12-05T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.834068 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.834121 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.834133 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.834149 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.834160 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:02Z","lastTransitionTime":"2025-12-05T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.937340 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.937398 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.937409 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.937426 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:02 crc kubenswrapper[4759]: I1205 00:24:02.937436 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:02Z","lastTransitionTime":"2025-12-05T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.039491 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.039546 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.039562 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.039586 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.039602 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:03Z","lastTransitionTime":"2025-12-05T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.142375 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.142447 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.142469 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.142499 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.142521 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:03Z","lastTransitionTime":"2025-12-05T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.155776 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:03 crc kubenswrapper[4759]: E1205 00:24:03.155891 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.155926 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:03 crc kubenswrapper[4759]: E1205 00:24:03.156035 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.246726 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.246768 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.246780 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.246795 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.246805 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:03Z","lastTransitionTime":"2025-12-05T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.349872 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.349951 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.349976 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.350009 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.350035 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:03Z","lastTransitionTime":"2025-12-05T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.453378 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.453452 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.453476 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.453516 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.453540 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:03Z","lastTransitionTime":"2025-12-05T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.555774 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.555838 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.555855 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.555878 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.555896 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:03Z","lastTransitionTime":"2025-12-05T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.660916 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.660969 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.660979 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.660996 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.661008 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:03Z","lastTransitionTime":"2025-12-05T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.763967 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.764468 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.764727 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.765043 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.765664 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:03Z","lastTransitionTime":"2025-12-05T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.868239 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.868478 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.868543 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.868629 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.868688 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:03Z","lastTransitionTime":"2025-12-05T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.994857 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.994908 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.994924 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.994944 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:03 crc kubenswrapper[4759]: I1205 00:24:03.994958 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:03Z","lastTransitionTime":"2025-12-05T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.097923 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.097999 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.098023 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.098056 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.098078 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:04Z","lastTransitionTime":"2025-12-05T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.155666 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.155673 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:04 crc kubenswrapper[4759]: E1205 00:24:04.155911 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:04 crc kubenswrapper[4759]: E1205 00:24:04.156003 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.203697 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.203759 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.203785 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.203815 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.203830 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:04Z","lastTransitionTime":"2025-12-05T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.306475 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.306503 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.306510 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.306524 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.306532 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:04Z","lastTransitionTime":"2025-12-05T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.408177 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.408203 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.408214 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.408231 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.408243 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:04Z","lastTransitionTime":"2025-12-05T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.510348 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.510376 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.510385 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.510397 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.510406 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:04Z","lastTransitionTime":"2025-12-05T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.612379 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.612419 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.612436 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.612452 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.612463 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:04Z","lastTransitionTime":"2025-12-05T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.715078 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.715122 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.715133 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.715151 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.715163 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:04Z","lastTransitionTime":"2025-12-05T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.818537 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.818578 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.818587 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.818603 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.818612 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:04Z","lastTransitionTime":"2025-12-05T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.920846 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.920890 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.920907 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.920931 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:04 crc kubenswrapper[4759]: I1205 00:24:04.920951 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:04Z","lastTransitionTime":"2025-12-05T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.023681 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.023715 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.023726 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.023740 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.023753 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:05Z","lastTransitionTime":"2025-12-05T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.126694 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.126745 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.126761 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.126780 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.126794 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:05Z","lastTransitionTime":"2025-12-05T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.155196 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.155257 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:05 crc kubenswrapper[4759]: E1205 00:24:05.155368 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:05 crc kubenswrapper[4759]: E1205 00:24:05.155449 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.229102 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.229132 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.229142 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.229157 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.229168 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:05Z","lastTransitionTime":"2025-12-05T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.330860 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.330902 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.330913 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.330930 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.330941 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:05Z","lastTransitionTime":"2025-12-05T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.433398 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.433447 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.433459 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.433478 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.433490 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:05Z","lastTransitionTime":"2025-12-05T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.535599 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.535656 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.535671 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.535694 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.535709 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:05Z","lastTransitionTime":"2025-12-05T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.637886 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.637939 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.637947 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.637961 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.637972 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:05Z","lastTransitionTime":"2025-12-05T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.740211 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.740254 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.740264 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.740280 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.740291 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:05Z","lastTransitionTime":"2025-12-05T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.842735 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.843041 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.843071 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.843088 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.843100 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:05Z","lastTransitionTime":"2025-12-05T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.945903 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.945944 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.945953 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.945969 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:05 crc kubenswrapper[4759]: I1205 00:24:05.945977 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:05Z","lastTransitionTime":"2025-12-05T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.049135 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.049188 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.049201 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.049218 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.049231 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:06Z","lastTransitionTime":"2025-12-05T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.151968 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.152018 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.152026 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.152040 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.152054 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:06Z","lastTransitionTime":"2025-12-05T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.155481 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.155507 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:06 crc kubenswrapper[4759]: E1205 00:24:06.155573 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:06 crc kubenswrapper[4759]: E1205 00:24:06.155855 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.255174 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.255206 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.255215 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.255228 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.255237 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:06Z","lastTransitionTime":"2025-12-05T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.358567 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.358622 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.358635 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.358657 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.358671 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:06Z","lastTransitionTime":"2025-12-05T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.461662 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.461722 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.461735 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.461757 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.461774 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:06Z","lastTransitionTime":"2025-12-05T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.565516 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.565588 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.565602 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.565627 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.565646 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:06Z","lastTransitionTime":"2025-12-05T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.668535 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.668598 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.668607 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.668631 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.668644 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:06Z","lastTransitionTime":"2025-12-05T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.770933 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.771396 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.771550 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.771696 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.771843 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:06Z","lastTransitionTime":"2025-12-05T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.874674 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.875150 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.875397 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.875595 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.875814 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:06Z","lastTransitionTime":"2025-12-05T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.978703 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.978745 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.978756 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.978772 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:06 crc kubenswrapper[4759]: I1205 00:24:06.978782 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:06Z","lastTransitionTime":"2025-12-05T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.082235 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.082329 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.082348 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.082369 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.082385 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:07Z","lastTransitionTime":"2025-12-05T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.155571 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.155601 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:07 crc kubenswrapper[4759]: E1205 00:24:07.155753 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:07 crc kubenswrapper[4759]: E1205 00:24:07.155867 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.184379 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.184441 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.184463 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.184489 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.184507 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:07Z","lastTransitionTime":"2025-12-05T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.287431 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.287471 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.287480 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.287497 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.287514 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:07Z","lastTransitionTime":"2025-12-05T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.390687 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.390753 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.390765 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.390784 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.390801 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:07Z","lastTransitionTime":"2025-12-05T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.493781 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.493836 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.493849 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.493871 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.493887 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:07Z","lastTransitionTime":"2025-12-05T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.596891 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.596967 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.596984 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.597009 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.597027 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:07Z","lastTransitionTime":"2025-12-05T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.700282 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.700368 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.700422 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.700450 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.700465 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:07Z","lastTransitionTime":"2025-12-05T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.802724 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.802764 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.802773 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.802786 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.802795 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:07Z","lastTransitionTime":"2025-12-05T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.904840 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.904872 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.904898 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.904911 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:07 crc kubenswrapper[4759]: I1205 00:24:07.904920 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:07Z","lastTransitionTime":"2025-12-05T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.007676 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.007705 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.007713 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.007725 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.007735 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.109756 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.109820 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.109829 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.109841 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.109850 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.155273 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.155337 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:08 crc kubenswrapper[4759]: E1205 00:24:08.155425 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:08 crc kubenswrapper[4759]: E1205 00:24:08.155485 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.212774 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.212828 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.212844 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.212868 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.212884 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.316075 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.316139 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.316151 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.316168 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.316199 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.419637 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.419712 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.419726 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.419770 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.419783 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.522681 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.522747 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.522761 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.522782 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.522795 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.625999 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.626063 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.626074 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.626089 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.626099 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.647761 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:08 crc kubenswrapper[4759]: E1205 00:24:08.647898 4759 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:24:08 crc kubenswrapper[4759]: E1205 00:24:08.647967 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs podName:f6ca2f36-241c-41cb-9d1d-d6856e819953 nodeName:}" failed. No retries permitted until 2025-12-05 00:24:40.647949024 +0000 UTC m=+99.863609974 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs") pod "network-metrics-daemon-ksxg9" (UID: "f6ca2f36-241c-41cb-9d1d-d6856e819953") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.729172 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.729210 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.729221 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.729235 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.729246 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.831255 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.831322 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.831334 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.831351 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.831363 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.850919 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.850982 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.850993 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.851009 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.851046 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: E1205 00:24:08.869631 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:08Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.874882 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.874943 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.874957 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.874981 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.874996 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: E1205 00:24:08.891656 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:08Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.895834 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.895876 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.895888 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.895908 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.895919 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: E1205 00:24:08.907848 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:08Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.911760 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.911808 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.911822 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.911843 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.911856 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: E1205 00:24:08.925571 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:08Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.929205 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.929230 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
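Every retry replays the same five messages (four node events plus the NotReady condition), so a handful of distinct klog bodies accounts for nearly all of the volume here. A minimal sketch that tallies entries by message body, assuming one journal entry per line as journalctl emits them; the header regex is fitted to the kubenswrapper lines shown in this log, not a general journald parser:

```python
import re
import sys
from collections import Counter

# Minimal sketch (not part of the log): tally kubelet entries by their klog
# message body (everything after the "file.go:line] " header) so the few
# distinct messages behind this flood stand out. Assumes one journal entry
# per line, as journalctl emits them.
HEADER = re.compile(r"kubenswrapper\[\d+\]: [IWE]\d{4} \S+ +\d+ \S+\] ")

counts = Counter()
for line in sys.stdin:
    m = HEADER.search(line)
    if m:
        counts[line[m.end():].strip()] += 1

for message, n in counts.most_common(10):
    print(f"{n:6d}  {message[:120]}")
```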
event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.929242 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.929259 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.929271 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:08 crc kubenswrapper[4759]: E1205 00:24:08.939951 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:08Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:08 crc kubenswrapper[4759]: E1205 00:24:08.940101 4759 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.941609 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
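The NotReady condition itself is independent of the certificate failure: kubelet reports it because /etc/kubernetes/cni/net.d/ contains no CNI configuration yet. A minimal sketch of that check from the node's shell; the accepted extension list (.conf, .conflist, .json) is the common CNI convention and an assumption here, not something this log states:

```python
import os
import sys

# Minimal sketch (not part of the log): the NotReady condition above means
# kubelet's network plugin found nothing to load. List what a runtime would
# see in the directory named by the error; the extension set is the common
# CNI convention (.conf/.conflist/.json) and an assumption here.
CNI_DIR = "/etc/kubernetes/cni/net.d"

try:
    entries = sorted(e for e in os.listdir(CNI_DIR)
                     if e.endswith((".conf", ".conflist", ".json")))
except FileNotFoundError:
    sys.exit(f"{CNI_DIR} does not exist")

if entries:
    print(f"{len(entries)} CNI config file(s): {', '.join(entries)}")
else:
    print(f"{CNI_DIR} is empty, which matches the NetworkPluginNotReady error")
```

Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.941609 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc"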
event="NodeHasSufficientMemory" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.941628 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.941638 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.941650 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:08 crc kubenswrapper[4759]: I1205 00:24:08.941661 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:08Z","lastTransitionTime":"2025-12-05T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.044867 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.044919 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.044934 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.044956 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.044968 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:09Z","lastTransitionTime":"2025-12-05T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.148180 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.148485 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.148569 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.148650 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.148719 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:09Z","lastTransitionTime":"2025-12-05T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.154665 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.154673 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:09 crc kubenswrapper[4759]: E1205 00:24:09.154834 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:09 crc kubenswrapper[4759]: E1205 00:24:09.154912 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.250706 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.250743 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.250752 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.250766 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.250775 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:09Z","lastTransitionTime":"2025-12-05T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.353411 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.353785 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.353936 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.354090 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.354219 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:09Z","lastTransitionTime":"2025-12-05T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.457933 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.457988 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.458005 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.458029 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.458046 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:09Z","lastTransitionTime":"2025-12-05T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.560739 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.560819 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.560845 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.560877 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.560898 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:09Z","lastTransitionTime":"2025-12-05T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.663222 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.663289 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.663331 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.663350 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.663363 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:09Z","lastTransitionTime":"2025-12-05T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.765369 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.765413 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.765425 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.765445 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.765458 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:09Z","lastTransitionTime":"2025-12-05T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.868023 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.868344 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.868474 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.868581 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.868678 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:09Z","lastTransitionTime":"2025-12-05T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.972475 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.972940 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.973222 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.973549 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:09 crc kubenswrapper[4759]: I1205 00:24:09.974079 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:09Z","lastTransitionTime":"2025-12-05T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.076357 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.076388 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.076397 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.076412 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.076420 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:10Z","lastTransitionTime":"2025-12-05T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.154755 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:10 crc kubenswrapper[4759]: E1205 00:24:10.154970 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.155687 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:10 crc kubenswrapper[4759]: E1205 00:24:10.155847 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.179088 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.179152 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.179173 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.179195 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.179207 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:10Z","lastTransitionTime":"2025-12-05T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.282222 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.282266 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.282277 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.282293 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.282323 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:10Z","lastTransitionTime":"2025-12-05T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.385192 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.385258 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.385282 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.385355 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.385381 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:10Z","lastTransitionTime":"2025-12-05T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.487903 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.487948 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.487960 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.487978 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.487990 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:10Z","lastTransitionTime":"2025-12-05T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.591202 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.591281 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.591354 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.591405 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.591434 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:10Z","lastTransitionTime":"2025-12-05T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.657450 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-llpn6_b33957c4-8ef0-4b57-8e3c-183091f3b022/kube-multus/0.log" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.657510 4759 generic.go:334] "Generic (PLEG): container finished" podID="b33957c4-8ef0-4b57-8e3c-183091f3b022" containerID="91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d" exitCode=1 Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.657545 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-llpn6" event={"ID":"b33957c4-8ef0-4b57-8e3c-183091f3b022","Type":"ContainerDied","Data":"91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d"} Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.657952 4759 scope.go:117] "RemoveContainer" containerID="91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.672290 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.687435 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.693370 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.693433 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.693444 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.693457 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.693500 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:10Z","lastTransitionTime":"2025-12-05T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.697777 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.712289 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.723502 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06493027-6840-495a-b874-24cd666119e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db87ee4414d125d3b2c7793912651ef8da8b812b7cfe36a032bc5ac1bdc9ba84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ff5c2d198ddc477a01bd417061044b131461ef5cffdce4c56290296fdbb8140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c52ab59e908cca2e0d8e2cc5b808fb09cc893deb1fa77d516369f36f8bc6dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.737037 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.751053 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.760442 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.770743 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 
00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.779771 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.791148 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.795392 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.795414 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.795422 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.795435 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.795443 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:10Z","lastTransitionTime":"2025-12-05T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.803349 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.820528 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.838658 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:50Z\\\",\\\"message\\\":\\\"ller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 00:23:49.103226 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.851149 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.864515 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
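The payload that follows, like every err="failed to patch status ..." body in this stretch, is a Kubernetes strategic-merge patch (the $setElementOrder/conditions key is that format's list-ordering directive) serialized as a JSON string inside the structured log field, so its quotes arrive escaped. One json.loads per quoting level recovers it. A minimal sketch with a hypothetical truncated fragment, reduced to a single escape layer for illustration (the journal output carries an extra layer from the kubelet's own quoting of the err field):

    import json

    # Hypothetical fragment: only the metadata stanza of one patch above,
    # with one escape layer, so the two-step decode is visible.
    raw = '"{\\"metadata\\":{\\"uid\\":\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\"}}"'
    patch = json.loads(json.loads(raw))  # outer loads unquotes; inner loads parses
    print(patch["metadata"]["uid"])      # 45fa490b-1113-4ee6-9604-dc322ca11bd3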
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"2025-12-05T00:23:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1\\\\n2025-12-05T00:23:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1 to /host/opt/cni/bin/\\\\n2025-12-05T00:23:25Z [verbose] multus-daemon started\\\\n2025-12-05T00:23:25Z [verbose] Readiness Indicator file check\\\\n2025-12-05T00:24:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.875604 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
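The kube-multus termination message above shows the daemon checking for the OVN readiness indicator file at 00:23:25 and giving up at 00:24:10, 45 seconds later, with a "pollimmediate" timeout. The shape of that wait (check immediately, then at an interval, until a deadline) is sketched below; the one-second interval and 45-second timeout are assumptions chosen to match the timestamps in the message, not values read from multus configuration:

    import os
    import time

    def poll_immediate(path: str, interval: float = 1.0, timeout: float = 45.0) -> bool:
        # Check once up front, then every `interval` seconds until `timeout`.
        deadline = time.monotonic() + timeout
        while True:
            if os.path.exists(path):
                return True
            if time.monotonic() >= deadline:
                return False
            time.sleep(interval)

    # The file multus is waiting on in the record above:
    poll_immediate("/host/run/multus/cni/net.d/10-ovn-kubernetes.conf")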
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.897016 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.897045 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.897057 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.897072 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.897083 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:10Z","lastTransitionTime":"2025-12-05T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.999527 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.999709 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.999805 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:10 crc kubenswrapper[4759]: I1205 00:24:10.999911 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.000011 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:10Z","lastTransitionTime":"2025-12-05T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.102665 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.102712 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.102728 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.102750 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.102764 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:11Z","lastTransitionTime":"2025-12-05T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.155011 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.155024 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:11 crc kubenswrapper[4759]: E1205 00:24:11.155154 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
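The NodeNotReady condition repeated above is kubelet's CNI readiness gate: the container runtime reports NetworkReady=false until a network configuration appears in /etc/kubernetes/cni/net.d/, and with ovnkube-controller crash-looping, nothing writes one. A minimal sketch of that directory check; the .conf/.conflist/.json suffixes are the conventional CNI config extensions, assumed here rather than quoted from the log:

    from pathlib import Path

    def cni_config_present(net_d: str = "/etc/kubernetes/cni/net.d") -> bool:
        # CNI network configs conventionally end in .conf, .conflist, or .json.
        p = Path(net_d)
        return p.is_dir() and any(
            f.suffix in {".conf", ".conflist", ".json"} for f in p.iterdir()
        )

    # While the directory stays empty, kubelet keeps reporting
    # "container runtime network not ready" and the node stays NotReady.
    print(cni_config_present())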
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:11 crc kubenswrapper[4759]: E1205 00:24:11.155319 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.170790 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0
bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.187905 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"2025-12-05T00:23:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1\\\\n2025-12-05T00:23:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1 to /host/opt/cni/bin/\\\\n2025-12-05T00:23:25Z [verbose] multus-daemon started\\\\n2025-12-05T00:23:25Z [verbose] Readiness Indicator file check\\\\n2025-12-05T00:24:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.197018 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.205634 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.205697 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.205709 4759 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.205728 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.205740 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:11Z","lastTransitionTime":"2025-12-05T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.209504 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.224423 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.235948 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.247355 4759 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.261924 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.270928 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06493027-6840-495a-b874-24cd666119e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db87ee4414d125d3b2c7793912651ef8da8b812b7cfe36a032bc5ac1bdc9ba84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ff5c2d198ddc477a01bd417061044b131461ef5cffdce4c56290296fdbb8140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c52ab59e908cca2e0d8e2cc5b808fb09cc893deb1fa77d516369f36f8bc6dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.280865 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.291374 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.307808 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.307832 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.307840 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.307851 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.307862 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:11Z","lastTransitionTime":"2025-12-05T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.309201 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:50Z\\\",\\\"message\\\":\\\"ller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 00:23:49.103226 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.318628 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 
00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.327164 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.338041 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.348793 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.366672 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.410545 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.410598 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.410611 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.410628 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.410640 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:11Z","lastTransitionTime":"2025-12-05T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.512629 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.513670 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.513791 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.513898 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.513984 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:11Z","lastTransitionTime":"2025-12-05T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.616334 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.616514 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.616646 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.616744 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.617111 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:11Z","lastTransitionTime":"2025-12-05T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.662361 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-llpn6_b33957c4-8ef0-4b57-8e3c-183091f3b022/kube-multus/0.log" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.662406 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-llpn6" event={"ID":"b33957c4-8ef0-4b57-8e3c-183091f3b022","Type":"ContainerStarted","Data":"4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3"} Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.675945 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.688142 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.701665 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.712967 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.719715 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.719767 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.719784 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.719808 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.719824 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:11Z","lastTransitionTime":"2025-12-05T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.729650 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\
":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.743850 4759 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06493027-6840-495a-b874-24cd666119e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db87ee4414d125d3b2c7793912651ef8da8b812b7cfe36a032bc5ac1bdc9ba84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ff5c2d198ddc477a01bd417061044b131461ef5cffdce4c56290296fdbb8140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c52ab59e908cca2e0d8e2cc5b808fb09cc893deb1fa77d516369f36f8bc6dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.757356 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.769163 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.793831 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c2
1db86d2da7e26a85d002d5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:50Z\\\",\\\"message\\\":\\\"ller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 00:23:49.103226 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.807011 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.817855 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.822049 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.822082 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.822092 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.822107 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.822118 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:11Z","lastTransitionTime":"2025-12-05T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.831718 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.844347 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.859375 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.872821 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.883884 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"2025-12-05T00:23:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1\\\\n2025-12-05T00:23:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1 to /host/opt/cni/bin/\\\\n2025-12-05T00:23:25Z [verbose] multus-daemon started\\\\n2025-12-05T00:23:25Z [verbose] Readiness Indicator file check\\\\n2025-12-05T00:24:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.893378 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.924228 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.924387 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.924464 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.924545 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:11 crc kubenswrapper[4759]: I1205 00:24:11.924613 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:11Z","lastTransitionTime":"2025-12-05T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.027000 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.027291 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.027377 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.027452 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.027524 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:12Z","lastTransitionTime":"2025-12-05T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.130879 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.131161 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.131252 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.131361 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.131466 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:12Z","lastTransitionTime":"2025-12-05T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.155127 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:12 crc kubenswrapper[4759]: E1205 00:24:12.155385 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.155519 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:12 crc kubenswrapper[4759]: E1205 00:24:12.155744 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.234229 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.234270 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.234282 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.234296 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.234325 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:12Z","lastTransitionTime":"2025-12-05T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.336681 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.337023 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.337115 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.337205 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.337292 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:12Z","lastTransitionTime":"2025-12-05T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.440934 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.441364 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.441523 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.441659 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.441789 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:12Z","lastTransitionTime":"2025-12-05T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.544620 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.544677 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.544693 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.544711 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.544725 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:12Z","lastTransitionTime":"2025-12-05T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.647200 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.647250 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.647266 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.647285 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.647296 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:12Z","lastTransitionTime":"2025-12-05T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.750152 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.750200 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.750212 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.750231 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.750244 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:12Z","lastTransitionTime":"2025-12-05T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.853776 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.853815 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.853826 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.853842 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.853854 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:12Z","lastTransitionTime":"2025-12-05T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.956498 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.956858 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.956992 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.957112 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:12 crc kubenswrapper[4759]: I1205 00:24:12.957237 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:12Z","lastTransitionTime":"2025-12-05T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.059705 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.059807 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.059837 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.059869 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.059889 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:13Z","lastTransitionTime":"2025-12-05T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.155582 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:13 crc kubenswrapper[4759]: E1205 00:24:13.155721 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.155852 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:13 crc kubenswrapper[4759]: E1205 00:24:13.155989 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.161215 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.161253 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.161266 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.161287 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.161354 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:13Z","lastTransitionTime":"2025-12-05T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.263205 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.263236 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.263247 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.263262 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.263272 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:13Z","lastTransitionTime":"2025-12-05T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.365950 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.365999 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.366016 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.366037 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.366052 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:13Z","lastTransitionTime":"2025-12-05T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.468016 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.468062 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.468077 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.468096 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.468117 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:13Z","lastTransitionTime":"2025-12-05T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.571366 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.571411 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.571422 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.571439 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.571450 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:13Z","lastTransitionTime":"2025-12-05T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.673879 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.673917 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.673926 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.673942 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.673952 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:13Z","lastTransitionTime":"2025-12-05T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.777085 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.777133 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.777145 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.777166 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.777179 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:13Z","lastTransitionTime":"2025-12-05T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.880631 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.880703 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.880715 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.880732 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.880741 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:13Z","lastTransitionTime":"2025-12-05T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.983798 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.983860 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.983871 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.983888 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:13 crc kubenswrapper[4759]: I1205 00:24:13.983898 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:13Z","lastTransitionTime":"2025-12-05T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.086807 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.086879 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.086892 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.086911 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.086923 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:14Z","lastTransitionTime":"2025-12-05T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.154912 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.154936 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:14 crc kubenswrapper[4759]: E1205 00:24:14.155080 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:14 crc kubenswrapper[4759]: E1205 00:24:14.155246 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.189933 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.189978 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.189987 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.190002 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.190013 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:14Z","lastTransitionTime":"2025-12-05T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.292602 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.292639 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.292647 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.292660 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.292669 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:14Z","lastTransitionTime":"2025-12-05T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.395082 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.395133 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.395148 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.395166 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.395177 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:14Z","lastTransitionTime":"2025-12-05T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.498711 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.498785 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.498807 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.498836 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.498859 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:14Z","lastTransitionTime":"2025-12-05T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.602172 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.602212 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.602225 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.602242 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.602254 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:14Z","lastTransitionTime":"2025-12-05T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.705408 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.705460 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.705470 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.705488 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.705501 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:14Z","lastTransitionTime":"2025-12-05T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.809190 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.809242 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.809254 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.809272 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.809285 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:14Z","lastTransitionTime":"2025-12-05T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.912735 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.912800 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.912821 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.912847 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:14 crc kubenswrapper[4759]: I1205 00:24:14.912864 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:14Z","lastTransitionTime":"2025-12-05T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.016485 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.016553 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.016575 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.016603 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.016625 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:15Z","lastTransitionTime":"2025-12-05T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.120301 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.120430 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.120456 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.120488 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.120514 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:15Z","lastTransitionTime":"2025-12-05T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.155081 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:15 crc kubenswrapper[4759]: E1205 00:24:15.155341 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.155565 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:15 crc kubenswrapper[4759]: E1205 00:24:15.155683 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.223958 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.224025 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.224038 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.224057 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.224069 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:15Z","lastTransitionTime":"2025-12-05T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.327405 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.327474 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.327498 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.327526 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.327548 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:15Z","lastTransitionTime":"2025-12-05T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.430810 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.430882 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.430916 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.430954 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.430978 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:15Z","lastTransitionTime":"2025-12-05T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.533820 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.533892 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.533914 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.533941 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.533962 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:15Z","lastTransitionTime":"2025-12-05T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.637533 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.637579 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.637611 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.637629 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.637639 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:15Z","lastTransitionTime":"2025-12-05T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.740459 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.740536 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.740548 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.740590 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.740605 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:15Z","lastTransitionTime":"2025-12-05T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.843775 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.843850 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.843875 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.843903 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.843922 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:15Z","lastTransitionTime":"2025-12-05T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.946998 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.947072 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.947092 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.947120 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:15 crc kubenswrapper[4759]: I1205 00:24:15.947141 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:15Z","lastTransitionTime":"2025-12-05T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.051908 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.052023 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.052045 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.052072 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.052090 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:16Z","lastTransitionTime":"2025-12-05T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.154951 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:16 crc kubenswrapper[4759]: E1205 00:24:16.155341 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.155283 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:16 crc kubenswrapper[4759]: E1205 00:24:16.155552 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.156772 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.156857 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.156883 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.157434 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.157707 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:16Z","lastTransitionTime":"2025-12-05T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.261428 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.261483 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.261495 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.261513 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.261527 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:16Z","lastTransitionTime":"2025-12-05T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.364745 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.364828 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.364844 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.364872 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.364885 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:16Z","lastTransitionTime":"2025-12-05T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.468203 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.468245 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.468259 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.468276 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.468289 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:16Z","lastTransitionTime":"2025-12-05T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.572275 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.572379 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.572400 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.572457 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.572480 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:16Z","lastTransitionTime":"2025-12-05T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.675793 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.675860 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.675876 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.675899 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.675915 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:16Z","lastTransitionTime":"2025-12-05T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.779972 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.780035 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.780053 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.780081 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.780104 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:16Z","lastTransitionTime":"2025-12-05T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.882923 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.882996 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.883007 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.883047 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.883060 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:16Z","lastTransitionTime":"2025-12-05T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.985602 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.985663 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.985682 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.985712 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:16 crc kubenswrapper[4759]: I1205 00:24:16.985738 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:16Z","lastTransitionTime":"2025-12-05T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.089237 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.089286 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.089348 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.089373 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.089385 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:17Z","lastTransitionTime":"2025-12-05T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.154795 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.154833 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:17 crc kubenswrapper[4759]: E1205 00:24:17.154997 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:17 crc kubenswrapper[4759]: E1205 00:24:17.155377 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.156463 4759 scope.go:117] "RemoveContainer" containerID="ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.192527 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.192565 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.192574 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.192588 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.192598 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:17Z","lastTransitionTime":"2025-12-05T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.295567 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.295626 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.295637 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.295658 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.295670 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:17Z","lastTransitionTime":"2025-12-05T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.399130 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.399188 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.399198 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.399227 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.399242 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:17Z","lastTransitionTime":"2025-12-05T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.502392 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.502442 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.502458 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.502478 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.502491 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:17Z","lastTransitionTime":"2025-12-05T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.605281 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.605328 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.605336 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.605349 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.605358 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:17Z","lastTransitionTime":"2025-12-05T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.712536 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.712578 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.712588 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.712609 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.712620 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:17Z","lastTransitionTime":"2025-12-05T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.814748 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.814782 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.814793 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.814809 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.814819 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:17Z","lastTransitionTime":"2025-12-05T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.917242 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.917271 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.917280 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.917295 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:17 crc kubenswrapper[4759]: I1205 00:24:17.917322 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:17Z","lastTransitionTime":"2025-12-05T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.019623 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.019686 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.019696 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.019711 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.019724 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.122158 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.122193 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.122201 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.122215 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.122223 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.154900 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.154913 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:18 crc kubenswrapper[4759]: E1205 00:24:18.155177 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:18 crc kubenswrapper[4759]: E1205 00:24:18.155326 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.167646 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.224162 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.224195 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.224205 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.224217 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.224227 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.327570 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.327649 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.327673 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.327705 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.327728 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.430129 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.430164 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.430173 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.430186 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.430196 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.532824 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.532857 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.532869 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.532887 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.532899 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.635459 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.635512 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.635525 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.635542 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.635555 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.693168 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/2.log" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.696545 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.712915 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.728180 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.738872 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.738920 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.738931 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.738952 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.738964 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.744218 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.756580 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.766083 
4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.780871 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.793395 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06493027-6840-495a-b874-24cd666119e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db87ee4414d125d3b2c7793912651ef8da8b812b7cfe36a032bc5ac1bdc9ba84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ff5c2d198ddc477a01bd417061044b131461ef5cffdce4c56290296fdbb8140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c52ab59e908cca2e0d8e2cc5b808fb09cc893deb1fa77d516369f36f8bc6dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.806693 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.822901 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.842031 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.842083 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.842099 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.842122 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.842137 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
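The NodeNotReady loop keeps pointing at the same root symptom: kubelet finds no CNI config in /etc/kubernetes/cni/net.d/ until ovnkube-controller comes up and writes one. A hedged sketch of that check, to be run on the node; the directory name is taken from the error message, while the `.conf`/`.conflist` glob is an assumption about what kubelet accepts:

```python
from pathlib import Path

# Directory named in the NetworkPluginNotReady errors above.
cni_dir = Path("/etc/kubernetes/cni/net.d")

configs = sorted(cni_dir.glob("*.conf*")) if cni_dir.is_dir() else []
if configs:
    for conf in configs:
        print("found CNI config:", conf)
else:
    print(f"no CNI configuration file in {cni_dir}/")  # same wording as kubelet
```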
Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.844351 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:50Z\\\",\\\"message\\\":\\\"ller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 00:23:49.103226 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.856706 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 
00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.869043 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.888621 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb12a29-4683-4427-bce3-b0729c76a7c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52e2f9883f6f025e438c9f6cb52d12f65084e6931517f3b135614c9113c0804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd723f0bd311cc4164d0f6215eca13aaae99ca736ba3fdf3b6c3750552a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9711e058b4e14a3533033db36c3146dc3ddaccd2daf17d3129343bb782f38f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b951da0c48cff251b767d024410ea672ffb21d
a5638ea463b5e08ca22211e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2b27cbaed139042437e0946ecba9e302ec53616546a6e4fc0e9352a462d99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.904352 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.916777 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.927616 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.938945 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.944825 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.944859 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.944867 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.944879 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.944888 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.950038 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.950143 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.950152 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.950171 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.950181 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.955095 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"2025-12-05T00:23:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1\\\\n2025-12-05T00:23:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1 to /host/opt/cni/bin/\\\\n2025-12-05T00:23:25Z [verbose] multus-daemon started\\\\n2025-12-05T00:23:25Z [verbose] Readiness Indicator file check\\\\n2025-12-05T00:24:10Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: E1205 00:24:18.965578 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.969665 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.969741 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.969772 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.969790 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.969799 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: E1205 00:24:18.981702 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.985298 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.985366 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.985378 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.985394 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.985406 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:18 crc kubenswrapper[4759]: E1205 00:24:18.995490 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.998182 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.998221 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.998229 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.998244 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:18 crc kubenswrapper[4759]: I1205 00:24:18.998254 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:18Z","lastTransitionTime":"2025-12-05T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: E1205 00:24:19.009186 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.012472 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.012524 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
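The status patches above are rejected before they reach storage: the node.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743/node presents a serving certificate whose validity window ended 2025-08-24T17:21:41Z, while the node clock reads 2025-12-05T00:24:19Z, so every attempt fails the TLS handshake the same way. A minimal sketch of the validity check behind the "x509: certificate has expired or is not yet valid" message, using a hypothetical certificate path (this is not kubelet or webhook source code):

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// Sketch: the Go TLS stack rejects a peer certificate whenever the
// verification time falls outside its [NotBefore, NotAfter] window.
func main() {
	// Hypothetical path; the real serving cert is whatever the
	// network-node-identity webhook presents on 127.0.0.1:9743.
	pemBytes, err := os.ReadFile("/tmp/webhook-serving.crt")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	now := time.Now() // the log's "current time" is 2025-12-05T00:24:19Z
	fmt.Printf("NotBefore=%s NotAfter=%s\n", cert.NotBefore, cert.NotAfter)
	if now.After(cert.NotAfter) {
		fmt.Println("certificate has expired") // NotAfter in the log: 2025-08-24T17:21:41Z
	} else if now.Before(cert.NotBefore) {
		fmt.Println("certificate is not yet valid")
	}
}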
event="NodeHasNoDiskPressure" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.012541 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.012558 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.012571 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:19Z","lastTransitionTime":"2025-12-05T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: E1205 00:24:19.025837 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: E1205 00:24:19.025998 4759 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.047037 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.047087 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.047104 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.047122 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.047132 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:19Z","lastTransitionTime":"2025-12-05T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.149128 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.149192 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.149213 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.149244 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.149268 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:19Z","lastTransitionTime":"2025-12-05T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.155422 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:19 crc kubenswrapper[4759]: E1205 00:24:19.155540 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.155668 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:19 crc kubenswrapper[4759]: E1205 00:24:19.155814 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.252370 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.252414 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.252424 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.252439 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.252449 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:19Z","lastTransitionTime":"2025-12-05T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.355689 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.355739 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.355751 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.355768 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.355781 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:19Z","lastTransitionTime":"2025-12-05T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.460111 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.460172 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.460185 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.460207 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.460221 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:19Z","lastTransitionTime":"2025-12-05T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.562602 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.562665 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.562681 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.562705 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.562724 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:19Z","lastTransitionTime":"2025-12-05T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.664851 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.664909 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.664926 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.664948 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
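Every Ready=False condition above carries the same root cause: nothing has yet written a CNI network configuration into /etc/kubernetes/cni/net.d/ (on this cluster that file is presumably written by the OVN-Kubernetes node pod once its ovnkube-controller container stays up, and that container is crash-looping below). A short sketch of the equivalent readiness check, assuming only that the directory holds .conf/.conflist/.json files when the plugin is up (not kubelet source):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Sketch: check whether any CNI network config exists in the directory the
// kubelet log keeps naming. Until one appears, the runtime reports
// NetworkReady=false and the node stays NotReady.
func main() {
	dir := "/etc/kubernetes/cni/net.d" // path taken from the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	found := false
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions CNI loaders commonly accept
			found = true
			fmt.Println("found network config:", filepath.Join(dir, e.Name()))
		}
	}
	if !found {
		fmt.Println("no CNI configuration file; network plugin has not started")
	}
}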
Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.664962 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:19Z","lastTransitionTime":"2025-12-05T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.701260 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/3.log" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.702260 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/2.log" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.705105 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531" exitCode=1 Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.705153 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"} Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.705191 4759 scope.go:117] "RemoveContainer" containerID="ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.706510 4759 scope.go:117] "RemoveContainer" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531" Dec 05 00:24:19 crc kubenswrapper[4759]: E1205 00:24:19.706847 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3"
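The "back-off 40s restarting failed container" message is kubelet's CrashLoopBackOff for ovnkube-controller: the restart delay roughly doubles with each consecutive failure, from an initial 10s up to a 5m ceiling (the usual defaults, assumed here rather than read from this cluster's config), so a 40s back-off corresponds to the third failure in a row. A toy sketch of that schedule:

package main

import (
	"fmt"
	"time"
)

// Sketch of the assumed CrashLoopBackOff schedule: 10s initial delay,
// doubled after every consecutive failure, capped at 5 minutes.
func main() {
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for failure := 1; failure <= 6; failure++ {
		fmt.Printf("failure %d -> back-off %s\n", failure, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.737872 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 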
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb12a29-4683-4427-bce3-b0729c76a7c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52e2f9883f6f025e438c9f6cb52d12f65084e6931517f3b135614c9113c0804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd723f0bd311cc4164d0f6215eca13aaae99ca736ba3fdf3b6c3750552a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9711e058b4e14a3533033db36c3146dc3ddaccd2daf17d3129343bb782f38f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b951da0c48cff251b767d024410ea672ffb21d
a5638ea463b5e08ca22211e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2b27cbaed139042437e0946ecba9e302ec53616546a6e4fc0e9352a462d99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.756742 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.766933 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.767006 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.767021 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.767039 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.767053 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:19Z","lastTransitionTime":"2025-12-05T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.775515 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.787882 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.807888 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eaa46961c7892b51ca2ffe1dc9693968f8adfaf
8cc578f2a164dddb8cd2c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:50Z\\\",\\\"message\\\":\\\"ller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 00:23:49.103226 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\".go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 00:24:18.660518 6661 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mbhwx\\\\nI1205 00:24:18.660483 6661 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-5q8ns after 0 failed attempt(s)\\\\nI1205 00:24:18.660537 6661 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-5q8ns\\\\nI1205 00:24:18.660454 6661 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-wdk4j\\\\nI1205 00:24:18.660559 6661 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-wdk4j in node crc\\\\nF1205 00:24:18.660572 6661 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling we\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.820513 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 
00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.831826 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.848621 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.860402 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"2025-12-05T00:23:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1\\\\n2025-12-05T00:23:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1 to /host/opt/cni/bin/\\\\n2025-12-05T00:23:25Z [verbose] multus-daemon started\\\\n2025-12-05T00:23:25Z [verbose] Readiness 
Indicator file check\\\\n2025-12-05T00:24:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.869388 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.869436 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.869448 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.869465 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.869479 4759 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:19Z","lastTransitionTime":"2025-12-05T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.872464 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.889516 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.901915 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.917022 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.930026 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6
355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.942689 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06493027-6840-495a-b874-24cd666119e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db87ee4414d125d3b2c7793912651ef8da8b812b7cfe36a032bc5ac1bdc9ba84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ff5c2d198ddc477a01bd417061044b131461ef5cffdce4c56290296fdbb8140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c52ab59e908cca2e0d8e2cc5b808fb09cc893deb1fa77d516369f36f8bc6dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.955775 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.969612 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.971193 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.971231 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.971244 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.971260 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.971272 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:19Z","lastTransitionTime":"2025-12-05T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:19 crc kubenswrapper[4759]: I1205 00:24:19.979124 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.073939 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.073988 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.074001 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.074020 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.074031 4759 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:20Z","lastTransitionTime":"2025-12-05T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.154685 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.154737 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:20 crc kubenswrapper[4759]: E1205 00:24:20.154815 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:20 crc kubenswrapper[4759]: E1205 00:24:20.154900 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.176386 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.176462 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.176485 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.176512 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.176534 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:20Z","lastTransitionTime":"2025-12-05T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.279365 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.279400 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.279409 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.279422 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.279431 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:20Z","lastTransitionTime":"2025-12-05T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.381747 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.381789 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.381801 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.381818 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.381829 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:20Z","lastTransitionTime":"2025-12-05T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.484897 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.484936 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.484945 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.484960 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.484969 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:20Z","lastTransitionTime":"2025-12-05T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.588796 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.588868 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.588888 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.588918 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.588974 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:20Z","lastTransitionTime":"2025-12-05T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.692140 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.692255 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.692278 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.692339 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.692363 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:20Z","lastTransitionTime":"2025-12-05T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.711398 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/3.log" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.795665 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.795735 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.795777 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.795807 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.795827 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:20Z","lastTransitionTime":"2025-12-05T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.897972 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.898004 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.898013 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.898026 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:20 crc kubenswrapper[4759]: I1205 00:24:20.898037 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:20Z","lastTransitionTime":"2025-12-05T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.000925 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.000965 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.000979 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.001000 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.001015 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:21Z","lastTransitionTime":"2025-12-05T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.103264 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.103321 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.103333 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.103350 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.103361 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:21Z","lastTransitionTime":"2025-12-05T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.155136 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.155136 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:21 crc kubenswrapper[4759]: E1205 00:24:21.155490 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
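

The kube-multus termination message at the top of this excerpt ("still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition") is the other side of the same dependency: multus polls for the default network's config file and exits after a timeout. A sketch of that loop shape using k8s.io/apimachinery's wait.PollImmediate, where the 1s interval and 10m timeout are assumptions rather than multus's actual settings:

    package main

    import (
        "fmt"
        "os"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    func main() {
        // Indicator path from the kube-multus error near the top of this excerpt.
        indicator := "/host/run/multus/cni/net.d/10-ovn-kubernetes.conf"

        // Same loop shape as the "pollimmediate error: timed out waiting
        // for the condition" message; interval/timeout values are assumed.
        err := wait.PollImmediate(time.Second, 10*time.Minute, func() (bool, error) {
            if _, statErr := os.Stat(indicator); statErr != nil {
                if os.IsNotExist(statErr) {
                    return false, nil // keep polling until OVN writes the file
                }
                return false, statErr
            }
            return true, nil
        })
        if err != nil {
            fmt.Fprintln(os.Stderr, "have you checked that your default network is ready?", err)
            os.Exit(1)
        }
        fmt.Println("default network ready:", indicator)
    }

Once OVN-Kubernetes writes 10-ovn-kubernetes.conf the poll succeeds, which matches the kube-multus container flipping back to running at 00:24:10 in the entry above.
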
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:21 crc kubenswrapper[4759]: E1205 00:24:21.155334 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.176832 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.190386 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.226746 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.226798 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.226811 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.226829 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.226840 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:21Z","lastTransitionTime":"2025-12-05T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.238164 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb12a29-4683-4427-bce3-b0729c76a7c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52e2f9883f6f025e438c9f6cb52d12f65084e6931517f3b135614c9113c0804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd723f0bd311cc4164d0f6215eca13aaae99ca736ba3fdf3b6c3750552a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9711e058b4e14a3533033db36c3146dc3ddaccd2daf17d3129343bb782f38f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b951da0c48cff251b767d024410ea672ffb21da5638ea463b5e08ca22211e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2b27cbaed139042437e0946ecba9e302ec53616546a6e4fc0e9352a462d99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.262255 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.272935 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.295561 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.324131 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eaa46961c7892b51ca2ffe1dc9693968f8adfaf
8cc578f2a164dddb8cd2c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:50Z\\\",\\\"message\\\":\\\"ller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 00:23:49.103226 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\".go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 00:24:18.660518 6661 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mbhwx\\\\nI1205 00:24:18.660483 6661 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-5q8ns after 0 failed attempt(s)\\\\nI1205 00:24:18.660537 6661 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-5q8ns\\\\nI1205 00:24:18.660454 6661 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-wdk4j\\\\nI1205 00:24:18.660559 6661 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-wdk4j in node crc\\\\nF1205 00:24:18.660572 6661 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling we\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.328722 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.328751 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.328759 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.328771 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.328780 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:21Z","lastTransitionTime":"2025-12-05T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.337841 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.350626 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"2025-12-05T00:23:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1\\\\n2025-12-05T00:23:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1 to /host/opt/cni/bin/\\\\n2025-12-05T00:23:25Z [verbose] multus-daemon started\\\\n2025-12-05T00:23:25Z [verbose] Readiness Indicator file check\\\\n2025-12-05T00:24:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.360936 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.373259 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.388631 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.399807 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.413581 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6
355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.427782 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06493027-6840-495a-b874-24cd666119e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db87ee4414d125d3b2c7793912651ef8da8b812b7cfe36a032bc5ac1bdc9ba84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ff5c2d198ddc477a01bd417061044b131461ef5cffdce4c56290296fdbb8140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c52ab59e908cca2e0d8e2cc5b808fb09cc893deb1fa77d516369f36f8bc6dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.431573 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.431611 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.431621 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.431636 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.431649 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:21Z","lastTransitionTime":"2025-12-05T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.440819 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.454274 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.466125 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.534176 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.534211 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.534220 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.534234 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.534243 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:21Z","lastTransitionTime":"2025-12-05T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.636911 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.636964 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.636975 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.636992 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.637004 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:21Z","lastTransitionTime":"2025-12-05T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.739197 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.739260 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.739271 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.739291 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.739341 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:21Z","lastTransitionTime":"2025-12-05T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
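
[annotation] The NodeNotReady loop interleaved here is a separate failure from the webhook problem: the kubelet reports KubeletNotReady because the container runtime finds no CNI configuration in /etc/kubernetes/cni/net.d/, and it re-records the same five node events on every status tick until a network plugin writes one. The failing check amounts to scanning that confdir, which is easy to reproduce out of band. The snippet below is a rough approximation using the conventional *.conf/*.conflist/*.json naming, which is assumed rather than taken from this cluster's runtime configuration.

```python
# Rough reproduction of the readiness check behind "no CNI configuration
# file in /etc/kubernetes/cni/net.d/": scan the confdir for network configs
# and report NetworkReady accordingly. The file patterns are the
# conventional CNI ones, assumed for illustration.
import glob
import json
import os

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"  # directory named in the log

def find_cni_configs(conf_dir):
    files = []
    for pattern in ("*.conf", "*.conflist", "*.json"):
        files.extend(glob.glob(os.path.join(conf_dir, pattern)))
    return sorted(files)  # the lexicographically first valid file wins

configs = find_cni_configs(CNI_CONF_DIR)
if not configs:
    print(f"NetworkReady=false: no CNI configuration file in {CNI_CONF_DIR}/")
else:
    for path in configs:
        with open(path) as fh:
            conf = json.load(fh)
        print(path, "->", conf.get("name"))
```

On this node the directory likely stays empty because the OVN-Kubernetes components that would populate it cannot make progress while the webhook above is rejecting requests, so the two failures reinforce each other.
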
Has your network provider started?"} Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.841026 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.841062 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.841070 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.841083 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.841092 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:21Z","lastTransitionTime":"2025-12-05T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.945092 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.945128 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.945138 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.945153 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:21 crc kubenswrapper[4759]: I1205 00:24:21.945162 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:21Z","lastTransitionTime":"2025-12-05T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.049416 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.049454 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.049462 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.049477 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.049487 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:22Z","lastTransitionTime":"2025-12-05T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.152002 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.152044 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.152053 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.152085 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.152095 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:22Z","lastTransitionTime":"2025-12-05T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.155251 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.155359 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:22 crc kubenswrapper[4759]: E1205 00:24:22.155506 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:22 crc kubenswrapper[4759]: E1205 00:24:22.155767 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.254729 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.254778 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.254791 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.254808 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.254819 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:22Z","lastTransitionTime":"2025-12-05T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.357282 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.357339 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.357353 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.357367 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.357376 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:22Z","lastTransitionTime":"2025-12-05T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.459943 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.459974 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.459982 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.459994 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.460003 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:22Z","lastTransitionTime":"2025-12-05T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.562946 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.562984 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.562994 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.563014 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.563025 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:22Z","lastTransitionTime":"2025-12-05T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.665185 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.665234 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.665244 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.665262 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.665275 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:22Z","lastTransitionTime":"2025-12-05T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.767605 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.767649 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.767657 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.767673 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.767683 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:22Z","lastTransitionTime":"2025-12-05T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.869907 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.869941 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.869950 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.869963 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.869971 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:22Z","lastTransitionTime":"2025-12-05T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.972743 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.972795 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.972805 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.972819 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:22 crc kubenswrapper[4759]: I1205 00:24:22.972828 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:22Z","lastTransitionTime":"2025-12-05T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.075553 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.075615 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.075627 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.075639 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.075650 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:23Z","lastTransitionTime":"2025-12-05T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.155375 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:23 crc kubenswrapper[4759]: E1205 00:24:23.155494 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.155553 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:23 crc kubenswrapper[4759]: E1205 00:24:23.155682 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.178147 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.178186 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.178196 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.178211 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.178221 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:23Z","lastTransitionTime":"2025-12-05T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.280604 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.280678 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.280694 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.280717 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.280733 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:23Z","lastTransitionTime":"2025-12-05T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.382795 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.382850 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.382859 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.382873 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.382882 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:23Z","lastTransitionTime":"2025-12-05T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.485534 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.485570 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.485580 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.485595 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.485604 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:23Z","lastTransitionTime":"2025-12-05T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.587640 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.588333 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.588355 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.588375 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.588388 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:23Z","lastTransitionTime":"2025-12-05T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.691265 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.691333 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.691345 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.691363 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.691376 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:23Z","lastTransitionTime":"2025-12-05T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.794483 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.794562 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.794587 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.794616 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.794640 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:23Z","lastTransitionTime":"2025-12-05T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.897653 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.898040 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.898059 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.898077 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:23 crc kubenswrapper[4759]: I1205 00:24:23.898090 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:23Z","lastTransitionTime":"2025-12-05T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
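
[annotation] From here to the end of the excerpt the journal is almost entirely the same five node-status lines repeated every ~100 ms, with only timestamps changing. When scanning a capture like this, collapsing consecutive repeats makes the genuinely new events (the SyncLoop ADD below, the sandbox messages) stand out. A small sketch, assuming journalctl-style prefixes as above:

```python
# Collapse consecutive journal lines that differ only in timestamps and log
# offsets, printing each surviving line once with a repeat count. The prefix
# pattern assumes the "Mon DD HH:MM:SS host unit[pid]: I1205 HH:MM:SS.ffffff
# pid" shape seen in this log; embedded ISO timestamps are normalized too.
import re
import sys

PREFIX = re.compile(
    r"^\w{3} \d{2} [\d:]{8} \S+ \S+\[\d+\]: [IWEF]\d{4} [\d:.]+\s+\d+\s+"
)
ISO = re.compile(r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z")

def collapse(stream):
    last, count = None, 0
    for raw in stream:
        key = ISO.sub("<ts>", PREFIX.sub("", raw.rstrip("\n")))
        if key == last:
            count += 1
            continue
        if last is not None:
            print(f"[x{count}] {last}")
        last, count = key, 1
    if last is not None:
        print(f"[x{count}] {last}")

collapse(sys.stdin)
```
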
Has your network provider started?"} Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.001685 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.001763 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.001794 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.001827 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.001846 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:24Z","lastTransitionTime":"2025-12-05T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.103944 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.104018 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.104042 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.104073 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.104098 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:24Z","lastTransitionTime":"2025-12-05T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.154801 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.154900 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:24 crc kubenswrapper[4759]: E1205 00:24:24.155171 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:24 crc kubenswrapper[4759]: E1205 00:24:24.155400 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.169566 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.206820 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.206891 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.206913 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.206947 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.206970 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:24Z","lastTransitionTime":"2025-12-05T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.310383 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.310423 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.310434 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.310448 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.310458 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:24Z","lastTransitionTime":"2025-12-05T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.412856 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.412896 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.412930 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.412944 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.412953 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:24Z","lastTransitionTime":"2025-12-05T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.515502 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.515544 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.515554 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.515569 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.515581 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:24Z","lastTransitionTime":"2025-12-05T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.618235 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.618274 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.618283 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.618296 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.618325 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:24Z","lastTransitionTime":"2025-12-05T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.720539 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.720588 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.720599 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.720616 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.720627 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:24Z","lastTransitionTime":"2025-12-05T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.823009 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.823107 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.823124 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.823146 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.823172 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:24Z","lastTransitionTime":"2025-12-05T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.925900 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.925966 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.925988 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.926016 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:24 crc kubenswrapper[4759]: I1205 00:24:24.926038 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:24Z","lastTransitionTime":"2025-12-05T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.028855 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.028898 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.028911 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.028926 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.028936 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:25Z","lastTransitionTime":"2025-12-05T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.132142 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.132189 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.132204 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.132220 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.132230 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:25Z","lastTransitionTime":"2025-12-05T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.155603 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.155602 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:25 crc kubenswrapper[4759]: E1205 00:24:25.155791 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:25 crc kubenswrapper[4759]: E1205 00:24:25.155982 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.234693 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.234765 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.234791 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.234820 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.234841 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:25Z","lastTransitionTime":"2025-12-05T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.337562 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.337785 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.337799 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.337824 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.337833 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:25Z","lastTransitionTime":"2025-12-05T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.441291 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.441413 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.441441 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.441468 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.441488 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:25Z","lastTransitionTime":"2025-12-05T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.544166 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.544216 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.544230 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.544246 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.544257 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:25Z","lastTransitionTime":"2025-12-05T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.646372 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.646414 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.646424 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.646439 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.646448 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:25Z","lastTransitionTime":"2025-12-05T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.748563 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.748613 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.748625 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.748642 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.748653 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:25Z","lastTransitionTime":"2025-12-05T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.852018 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.852059 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.852069 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.852085 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.852095 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:25Z","lastTransitionTime":"2025-12-05T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:25 crc kubenswrapper[4759]: I1205 00:24:25.927943 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:24:25 crc kubenswrapper[4759]: E1205 00:24:25.928149 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.928121602 +0000 UTC m=+149.143782552 (durationBeforeRetry 1m4s). 
[... same node status block repeated at 00:24:25.955 ...]
Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.029622 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.029682 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.029712 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.029744 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.029815 4759 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.029836 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.029879 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.029896 4759 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.029861 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.02984825 +0000 UTC m=+149.245509200 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.029943 4759 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.030002 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.029960872 +0000 UTC m=+149.245621872 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.030033 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.030019754 +0000 UTC m=+149.245680834 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.030073 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.030091 4759 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.030103 4759 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.030173 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.030163418 +0000 UTC m=+149.245824378 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
[... same node status block repeated at 00:24:26.058 ...]
Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.154754 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.154777 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9"
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.154900 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:26 crc kubenswrapper[4759]: E1205 00:24:26.154953 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.161233 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.161261 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.161270 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.161282 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.161294 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:26Z","lastTransitionTime":"2025-12-05T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.264173 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.264234 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.264246 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.264263 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.264276 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:26Z","lastTransitionTime":"2025-12-05T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.366984 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.367027 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.367059 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.367074 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.367087 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:26Z","lastTransitionTime":"2025-12-05T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.470080 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.470113 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.470121 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.470134 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.470147 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:26Z","lastTransitionTime":"2025-12-05T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.572466 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.572511 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.572523 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.572539 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.572551 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:26Z","lastTransitionTime":"2025-12-05T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.674709 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.674758 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.674766 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.674780 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.674791 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:26Z","lastTransitionTime":"2025-12-05T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.777879 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.777918 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.777927 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.777942 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.777951 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:26Z","lastTransitionTime":"2025-12-05T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.880961 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.881020 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.881031 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.881067 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.881092 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:26Z","lastTransitionTime":"2025-12-05T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.984066 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.984103 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.984114 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.984129 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:26 crc kubenswrapper[4759]: I1205 00:24:26.984140 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:26Z","lastTransitionTime":"2025-12-05T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.086985 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.087035 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.087108 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.087131 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.087146 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:27Z","lastTransitionTime":"2025-12-05T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.154830 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.155040 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:27 crc kubenswrapper[4759]: E1205 00:24:27.155180 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:27 crc kubenswrapper[4759]: E1205 00:24:27.155255 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.189134 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.189174 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.189185 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.189202 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.189218 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:27Z","lastTransitionTime":"2025-12-05T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.291978 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.292039 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.292059 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.292082 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.292101 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:27Z","lastTransitionTime":"2025-12-05T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.394626 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.394677 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.394690 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.394707 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.394719 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:27Z","lastTransitionTime":"2025-12-05T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.497156 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.497204 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.497215 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.497229 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.497239 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:27Z","lastTransitionTime":"2025-12-05T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.599913 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.599968 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.599979 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.599996 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.600007 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:27Z","lastTransitionTime":"2025-12-05T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.702920 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.702981 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.702998 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.703024 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.703043 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:27Z","lastTransitionTime":"2025-12-05T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.805749 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.805796 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.805810 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.805828 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.805840 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:27Z","lastTransitionTime":"2025-12-05T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.908614 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.908659 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.908671 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.908688 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:27 crc kubenswrapper[4759]: I1205 00:24:27.908699 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:27Z","lastTransitionTime":"2025-12-05T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.012269 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.012374 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.012387 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.012404 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.012418 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:28Z","lastTransitionTime":"2025-12-05T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.114981 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.115070 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.115099 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.115124 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.115141 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:28Z","lastTransitionTime":"2025-12-05T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.155238 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.155363 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:28 crc kubenswrapper[4759]: E1205 00:24:28.155485 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:28 crc kubenswrapper[4759]: E1205 00:24:28.155626 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.217885 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.218005 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.218042 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.218075 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.218098 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:28Z","lastTransitionTime":"2025-12-05T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.320275 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.320358 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.320377 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.320401 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.320616 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:28Z","lastTransitionTime":"2025-12-05T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.424025 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.424087 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.424102 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.424126 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.424141 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:28Z","lastTransitionTime":"2025-12-05T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.527811 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.527883 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.527902 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.527928 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.527945 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:28Z","lastTransitionTime":"2025-12-05T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.630273 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.630412 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.630438 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.630471 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.630492 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:28Z","lastTransitionTime":"2025-12-05T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.733190 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.733260 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.733275 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.733327 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.733348 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:28Z","lastTransitionTime":"2025-12-05T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.836057 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.836117 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.836128 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.836141 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.836149 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:28Z","lastTransitionTime":"2025-12-05T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.939087 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.939136 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.939148 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.939169 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:28 crc kubenswrapper[4759]: I1205 00:24:28.939180 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:28Z","lastTransitionTime":"2025-12-05T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.041338 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.041386 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.041399 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.041415 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.041423 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.144718 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.144755 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.144766 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.144781 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.144792 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.155354 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.155451 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:29 crc kubenswrapper[4759]: E1205 00:24:29.155492 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:29 crc kubenswrapper[4759]: E1205 00:24:29.155597 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.166241 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.166388 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.166500 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.166589 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.166681 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: E1205 00:24:29.182850 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.186004 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.186060 4759 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.186078 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.186099 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.186115 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: E1205 00:24:29.200971 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.206066 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.206278 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.206507 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.206752 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.206988 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: E1205 00:24:29.230129 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.235289 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.235355 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.235369 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.235387 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.235399 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: E1205 00:24:29.252838 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.257458 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.257774 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.257845 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.257911 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.258086 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: E1205 00:24:29.275146 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:29 crc kubenswrapper[4759]: E1205 00:24:29.275631 4759 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.277271 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.277343 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.277360 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.277383 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.277399 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.379837 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.379889 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.379905 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.379930 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.379946 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.482853 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.482888 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.482896 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.482911 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.482919 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.587730 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.587780 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.587797 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.587822 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.587839 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.691126 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.691188 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.691206 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.691229 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.691245 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.794849 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.794916 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.794937 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.794989 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.795006 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.898438 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.898526 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.898743 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.898779 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:29 crc kubenswrapper[4759]: I1205 00:24:29.898806 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:29Z","lastTransitionTime":"2025-12-05T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.001729 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.001770 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.001783 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.001800 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.001811 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:30Z","lastTransitionTime":"2025-12-05T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.105074 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.105150 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.105174 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.105210 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.105234 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:30Z","lastTransitionTime":"2025-12-05T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.155850 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.155901 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:30 crc kubenswrapper[4759]: E1205 00:24:30.156137 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:30 crc kubenswrapper[4759]: E1205 00:24:30.156682 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.208555 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.208609 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.208620 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.208637 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.208651 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:30Z","lastTransitionTime":"2025-12-05T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.312272 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.312390 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.312408 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.312435 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.312457 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:30Z","lastTransitionTime":"2025-12-05T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.415609 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.415649 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.415657 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.415672 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.415682 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:30Z","lastTransitionTime":"2025-12-05T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.517535 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.517577 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.517585 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.517598 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.517608 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:30Z","lastTransitionTime":"2025-12-05T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.620664 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.620702 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.620714 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.620733 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.620745 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:30Z","lastTransitionTime":"2025-12-05T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.723715 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.723755 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.723764 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.723778 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.723787 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:30Z","lastTransitionTime":"2025-12-05T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.827500 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.827569 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.827588 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.827615 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.827635 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:30Z","lastTransitionTime":"2025-12-05T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.930669 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.930725 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.930737 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.930756 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:30 crc kubenswrapper[4759]: I1205 00:24:30.930768 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:30Z","lastTransitionTime":"2025-12-05T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.033683 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.033772 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.033797 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.033824 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.033842 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:31Z","lastTransitionTime":"2025-12-05T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.136635 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.136691 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.136709 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.136732 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.136750 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:31Z","lastTransitionTime":"2025-12-05T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.155287 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.155345 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:31 crc kubenswrapper[4759]: E1205 00:24:31.155547 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:31 crc kubenswrapper[4759]: E1205 00:24:31.155711 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.176952 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.194591 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.208630 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.225355 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.239022 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.239062 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.239073 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.239092 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.239104 4759 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:31Z","lastTransitionTime":"2025-12-05T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.244531 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06493027-6840-495a-b874-24cd666119e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db87ee4414d125d3b2c7793912651ef8da8b812b7cfe36a032bc5ac1bdc9ba84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ff5c2d198ddc477a01bd417061044b131461ef5cffdce4c56290296fdbb8140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c52ab59e908cca2e0d8e2cc5b808fb09cc893deb1fa77d516369f36f8bc6dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.258629 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.276112 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.299756 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eaa46961c7892b51ca2ffe1dc9693968f8adfaf
8cc578f2a164dddb8cd2c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef5ebbb67aea1a7d787cd869b17eb0c89e09a2c21db86d2da7e26a85d002d5e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:23:50Z\\\",\\\"message\\\":\\\"ller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"18746a4d-8a63-458a-b7e3-8fb89ff95fc0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-route-controller-manager/route-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.239\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 00:23:49.103226 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\".go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 00:24:18.660518 6661 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mbhwx\\\\nI1205 00:24:18.660483 6661 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-5q8ns after 0 failed attempt(s)\\\\nI1205 00:24:18.660537 6661 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-5q8ns\\\\nI1205 00:24:18.660454 6661 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-wdk4j\\\\nI1205 00:24:18.660559 6661 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-wdk4j in node crc\\\\nF1205 00:24:18.660572 6661 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling we\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.317564 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 
00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.333289 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.341488 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.341536 4759 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.341552 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.341577 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.341594 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:31Z","lastTransitionTime":"2025-12-05T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.354647 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb12a29-4683-4427-bce3-b0729c76a7c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52e2f9883f6f025e438c9f6cb52d12f65084e6931517f3b135614c9113c0804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd723f0bd311cc4164d0f6215eca13aaae99ca736ba3fdf3b6c3750552a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9711e058b4e14a3533033db36c3146dc3ddaccd2daf17d3129343bb782f38f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b951da0c48cff251b767d024410ea672ffb21da5638ea463b5e08ca22211e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2b27cbaed139042437e0946ecba9e302ec53616546a6e4fc0e9352a462d99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.369253 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.383135 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"2025-12-05T00:23:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1\\\\n2025-12-05T00:23:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1 to /host/opt/cni/bin/\\\\n2025-12-05T00:23:25Z [verbose] multus-daemon started\\\\n2025-12-05T00:23:25Z [verbose] Readiness Indicator file check\\\\n2025-12-05T00:24:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.396446 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.411141 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bde3f77e-2483-47db-9f9f-b9e73033a1ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cbb649eb107e2fd5b9575d770967c103b7f599f57b6f71d8af4b940a1bc0be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d53281ad03493aa2b3a99a88d7a554601c10252d24d053d0e6b03256f0a314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39d53281ad03493aa2b3a99a88d7a554601c10252d24d053d0e6b03256f0a314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.429095 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.440748 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2a
ea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.444346 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.444472 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.444539 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.444601 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.444666 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:31Z","lastTransitionTime":"2025-12-05T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.455177 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.466538 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.546785 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.546867 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.546884 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.546921 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.546956 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:31Z","lastTransitionTime":"2025-12-05T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.649408 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.649456 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.649472 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.649494 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.649511 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:31Z","lastTransitionTime":"2025-12-05T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.751864 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.751908 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.751926 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.751948 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.751967 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:31Z","lastTransitionTime":"2025-12-05T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.855135 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.855751 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.855987 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.856170 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.856474 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:31Z","lastTransitionTime":"2025-12-05T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.959881 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.959955 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.959968 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.959991 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:31 crc kubenswrapper[4759]: I1205 00:24:31.960002 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:31Z","lastTransitionTime":"2025-12-05T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.063284 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.063354 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.063368 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.063386 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.063402 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:32Z","lastTransitionTime":"2025-12-05T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.155605 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.155789 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:32 crc kubenswrapper[4759]: E1205 00:24:32.156164 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.156451 4759 scope.go:117] "RemoveContainer" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531" Dec 05 00:24:32 crc kubenswrapper[4759]: E1205 00:24:32.156650 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" Dec 05 00:24:32 crc kubenswrapper[4759]: E1205 00:24:32.156813 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.165997 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.166105 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.166225 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.166246 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.166259 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:32Z","lastTransitionTime":"2025-12-05T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.172788 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6ca2f36-241c-41cb-9d1d-d6856e819953\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rfgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ksxg9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.197605 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb12a29-4683-4427-bce3-b0729c76a7c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52e2f9883f6f025e438c9f6cb52d12f65084e6931517f3b135614c9113c0804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd723f0bd311cc4164d0f6215eca13aaae99ca736ba3fdf3b6c3750552a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9711e058b4e14a3533033db36c3146dc3ddaccd2daf17d3129343bb782f38f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b951da0c48cff251b767d024410ea672ffb21d
a5638ea463b5e08ca22211e11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2b27cbaed139042437e0946ecba9e302ec53616546a6e4fc0e9352a462d99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b389b8ca0d964bda15fd21a6946ab541835edf0ea5da70447bb335b19c6fb7cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba1919c10a9e5bb50b8da644e72503a19c98b6926ac5ce8db89daacba02ba74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22bbd9c46c064891006a7700d603ec05ad89f5ddaa4de5276e91d696d57b9d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.213212 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e2335ca4c3a909025e386eb1a159046b2b765ab19b53f2fd2b85b931535e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.227204 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.243040 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51a56acd-2ed2-498f-bcd1-93cd4ce2a21c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdfc292691dbcea0cf78c0313373a8c628b8a8efc4f043123a6a1a368027224c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2db5e9295d1b019730bfb4909765a7a5e1774a10b0652587933c0fc660d719cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371d1bc3a1bb4e83890acdba4d027300daf84c222a060542cebe07c27b6b5eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3a28338f482b1beab5927646c0dfa53f11009de278058d6254c9a07dd25d5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91f98d8f919e755cdd6ba81feb5b9ac3e4105186c92f3dcf96677db1eff5db30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2bbd21399225ade8375cec38bcd5e77dc2bf485bff66e807bf1fc26df48f5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295a25dbd3e29e739df0b25a5e2b45354e5389034459cea68a260f2a160b142c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqmrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdk4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.268614 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.269010 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:32 crc 
kubenswrapper[4759]: I1205 00:24:32.269199 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.269456 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.269670 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:32Z","lastTransitionTime":"2025-12-05T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.269169 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45fa490b-1113-4ee6-9604-dc322ca11bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eaa46961c7892b51ca2ffe1dc9693968f8adfaf
8cc578f2a164dddb8cd2c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:18Z\\\",\\\"message\\\":\\\".go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 00:24:18.660518 6661 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mbhwx\\\\nI1205 00:24:18.660483 6661 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-5q8ns after 0 failed attempt(s)\\\\nI1205 00:24:18.660537 6661 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-5q8ns\\\\nI1205 00:24:18.660454 6661 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-wdk4j\\\\nI1205 00:24:18.660559 6661 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-wdk4j in node crc\\\\nF1205 00:24:18.660572 6661 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling we\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:24:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t67cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mbhwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.286534 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b4f1c6b-7070-450f-8187-881125eec0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed13cb7e66b4a95afeead313c80afe3cbff133945a90b0d48ab80db16b501da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05588657c148300f1067c771e77ac98eeab1b6cdb27ff4e1a42505d3bc746c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spg8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.297792 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bde3f77e-2483-47db-9f9f-b9e73033a1ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cbb649eb107e2fd5b9575d770967c103b7f599f57b6f71d8af4b940a1bc0be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d53281ad03493aa2b3a99a88d7a554601c10252d24d053d0e6b03256f0a314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39d53281ad03493aa2b3a99a88d7a554601c10252d24d053d0e6b03256f0a314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.311778 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6484a10c-6456-4217-bb2c-3a0928ec2328\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7c952ec24c5ae4bf5bccf945f1208468c014e2d385b0643d4f13af9865fecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60a7572182a33f1050be42edb85f50fe7f449e6a3fa311cef957d8b2faa9b0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f209700778e192171446e0d26b90e421f48173264bbc088e42920d28098c7e96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.326287 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-llpn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33957c4-8ef0-4b57-8e3c-183091f3b022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T00:24:10Z\\\",\\\"message\\\":\\\"2025-12-05T00:23:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1\\\\n2025-12-05T00:23:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02834512-c27a-4373-b936-e57ac05aacc1 to /host/opt/cni/bin/\\\\n2025-12-05T00:23:25Z [verbose] multus-daemon started\\\\n2025-12-05T00:23:25Z [verbose] Readiness 
Indicator file check\\\\n2025-12-05T00:24:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wwnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-llpn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.339844 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnqtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0928c1-8104-4803-bf39-f48da5f1fec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e7a48a4a77dee508ec0cb3261a2a4a4a9c8ed707587cb1101872b536294ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b48z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnqtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.353744 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae01012ad880a0f0e19b8a24c91f3fb682f4188ebdfcea9d4210fc3a9e120bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.369041 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.371948 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.372119 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.372222 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.372338 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.372441 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:32Z","lastTransitionTime":"2025-12-05T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.381476 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"879c79ed-3fea-4896-84a5-e3c44d13a0c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c6c177c1f3e66c02c30ced81a45430dbaeafb623070b0ad910bc77a3365f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n5bwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5q8ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.395983 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef44e93e-b269-459c-b2ae-22a70267bc87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 00:23:15.159927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 00:23:15.162375 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1228208911/tls.crt::/tmp/serving-cert-1228208911/tls.key\\\\\\\"\\\\nI1205 00:23:21.451050 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 00:23:21.453281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 00:23:21.453324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 00:23:21.453410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 00:23:21.453436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 00:23:21.481118 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 00:23:21.481140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 00:23:21.481167 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 00:23:21.481172 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 00:23:21.481176 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 00:23:21.481180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 00:23:21.481376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 00:23:21.482509 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.409223 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06493027-6840-495a-b874-24cd666119e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db87ee4414d125d3b2c7793912651ef8da8b812b7cfe36a032bc5ac1bdc9ba84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ff5c2d198ddc477a01bd417061044b131461ef5cffdce4c56290296fdbb8140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c52ab59e908cca2e0d8e2cc5b808fb09cc893deb1fa77d516369f36f8bc6dbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6d623f801a98b8e879ed9167254b631fe85808cd57508fb9c60c4f03338c3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T00:23:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T00:23:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.422541 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.436420 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041971190aebb274ac649ee59edd956552078dc8529f09e3a2a0b7cea1891bdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a42a51e0c8259b1b409f811948c9d44a04d9d5538641c5b1c38f57e3a3a7119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.446241 4759 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7lmmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e42923da-2632-4c20-a3e8-26d46dccd346\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T00:23:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e3853eaa57250da62fedfa0a47453b2e3b21cbe77bb7ee5c26dc31405b25263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T00:23:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kk77p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T00:23:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7lmmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.475267 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 
Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.475267 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.475335 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.475350 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.475368 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.475378 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:32Z","lastTransitionTime":"2025-12-05T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-entry node-status cycle (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeats roughly every 100 ms from 00:24:32.577 through 00:24:36.077 with only the timestamps changing; the duplicate cycles are elided and only the interleaved once-per-second pod-sync entries are kept below ...]
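The setters.go:603 entries print the full NodeCondition the kubelet writes when it marks the node NotReady. To inspect that payload outside the cluster, it can be decoded with a small local struct mirroring the fields shown; the struct below is a simplified stand-in for Kubernetes' v1.NodeCondition, not an import of it:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Simplified stand-in for Kubernetes' v1.NodeCondition, covering only the
// fields that appear in the "Node became not ready" entries above.
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition payload copied verbatim from the setters.go:603 entry above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:32Z","lastTransitionTime":"2025-12-05T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}
```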
Has your network provider started?"} Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.783372 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.783501 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.783581 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.783615 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.783695 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:32Z","lastTransitionTime":"2025-12-05T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.887396 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.887463 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.887488 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.887520 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.887543 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:32Z","lastTransitionTime":"2025-12-05T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.989784 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.989853 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.989879 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.989908 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:32 crc kubenswrapper[4759]: I1205 00:24:32.989932 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:32Z","lastTransitionTime":"2025-12-05T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.092011 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.092043 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.092051 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.092064 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.092072 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:33Z","lastTransitionTime":"2025-12-05T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.155697 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.155910 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:33 crc kubenswrapper[4759]: E1205 00:24:33.156192 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:33 crc kubenswrapper[4759]: E1205 00:24:33.156049 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.194489 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.194536 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.194545 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.194558 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.194568 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:33Z","lastTransitionTime":"2025-12-05T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.296366 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.296419 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.296429 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.296442 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.296451 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:33Z","lastTransitionTime":"2025-12-05T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.399503 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.399582 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.399605 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.399636 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.399658 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:33Z","lastTransitionTime":"2025-12-05T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.503324 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.503373 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.503385 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.503400 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.503410 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:33Z","lastTransitionTime":"2025-12-05T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.606916 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.606955 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.606966 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.606983 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.606993 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:33Z","lastTransitionTime":"2025-12-05T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.709715 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.709807 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.709838 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.709874 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.709900 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:33Z","lastTransitionTime":"2025-12-05T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.812463 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.812509 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.812518 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.812533 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.812542 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:33Z","lastTransitionTime":"2025-12-05T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.915509 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.915562 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.915573 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.915591 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:33 crc kubenswrapper[4759]: I1205 00:24:33.915603 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:33Z","lastTransitionTime":"2025-12-05T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.017437 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.017480 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.017490 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.017505 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.017514 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:34Z","lastTransitionTime":"2025-12-05T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.119927 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.120284 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.120504 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.120635 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.120728 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:34Z","lastTransitionTime":"2025-12-05T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.155584 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:34 crc kubenswrapper[4759]: E1205 00:24:34.155743 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.155610 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:34 crc kubenswrapper[4759]: E1205 00:24:34.156070 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.223298 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.223972 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.224045 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.224114 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.224223 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:34Z","lastTransitionTime":"2025-12-05T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.327078 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.327155 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.327187 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.327219 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.327243 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:34Z","lastTransitionTime":"2025-12-05T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.429403 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.429449 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.429460 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.429476 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.429487 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:34Z","lastTransitionTime":"2025-12-05T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.531794 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.531855 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.531867 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.531886 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.531899 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:34Z","lastTransitionTime":"2025-12-05T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.634171 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.634268 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.634339 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.634373 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.634396 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:34Z","lastTransitionTime":"2025-12-05T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.736568 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.736629 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.736645 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.736666 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.736682 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:34Z","lastTransitionTime":"2025-12-05T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.839394 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.839446 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.839460 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.839478 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.839490 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:34Z","lastTransitionTime":"2025-12-05T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.942558 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.942615 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.942645 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.942668 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:34 crc kubenswrapper[4759]: I1205 00:24:34.942679 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:34Z","lastTransitionTime":"2025-12-05T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.046658 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.046737 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.046763 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.046797 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.046821 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:35Z","lastTransitionTime":"2025-12-05T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.149067 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.149110 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.149123 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.149138 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.149150 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:35Z","lastTransitionTime":"2025-12-05T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.155583 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.155646 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:35 crc kubenswrapper[4759]: E1205 00:24:35.155853 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:35 crc kubenswrapper[4759]: E1205 00:24:35.156055 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.251705 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.251740 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.251751 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.251766 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.251777 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:35Z","lastTransitionTime":"2025-12-05T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.354924 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.354983 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.354998 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.355016 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.355029 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:35Z","lastTransitionTime":"2025-12-05T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.457568 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.457617 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.457626 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.457641 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.457650 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:35Z","lastTransitionTime":"2025-12-05T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.560713 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.560796 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.560820 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.560852 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.560876 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:35Z","lastTransitionTime":"2025-12-05T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.664229 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.664292 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.664352 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.664378 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.664397 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:35Z","lastTransitionTime":"2025-12-05T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.770565 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.770612 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.770623 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.770639 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.770650 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:35Z","lastTransitionTime":"2025-12-05T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.873336 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.873376 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.873385 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.873399 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.873411 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:35Z","lastTransitionTime":"2025-12-05T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.975126 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.975161 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.975169 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.975181 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:35 crc kubenswrapper[4759]: I1205 00:24:35.975191 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:35Z","lastTransitionTime":"2025-12-05T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.077364 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.077421 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.077437 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.077461 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.077479 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:36Z","lastTransitionTime":"2025-12-05T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.154972 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.155008 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:36 crc kubenswrapper[4759]: E1205 00:24:36.155215 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:36 crc kubenswrapper[4759]: E1205 00:24:36.155317 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.180377 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.180488 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.180507 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.180584 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.180609 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:36Z","lastTransitionTime":"2025-12-05T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.282805 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.282854 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.282862 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.282876 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.282887 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:36Z","lastTransitionTime":"2025-12-05T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.386414 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.386449 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.386458 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.386473 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.386483 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:36Z","lastTransitionTime":"2025-12-05T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.489287 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.489357 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.489373 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.489394 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.489411 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:36Z","lastTransitionTime":"2025-12-05T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.591794 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.591830 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.591841 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.591856 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.591866 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:36Z","lastTransitionTime":"2025-12-05T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.694482 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.694546 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.694564 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.694588 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.694608 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:36Z","lastTransitionTime":"2025-12-05T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.797588 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.797699 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.797718 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.797741 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.797759 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:36Z","lastTransitionTime":"2025-12-05T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.900235 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.900352 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.900388 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.900417 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:36 crc kubenswrapper[4759]: I1205 00:24:36.900476 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:36Z","lastTransitionTime":"2025-12-05T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.003086 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.003121 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.003132 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.003147 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.003157 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:37Z","lastTransitionTime":"2025-12-05T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.106194 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.106239 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.106254 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.106275 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.106289 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:37Z","lastTransitionTime":"2025-12-05T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.155020 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.155134 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:37 crc kubenswrapper[4759]: E1205 00:24:37.155260 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:37 crc kubenswrapper[4759]: E1205 00:24:37.155450 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.208608 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.208656 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.208672 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.208691 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.208702 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:37Z","lastTransitionTime":"2025-12-05T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.310917 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.310956 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.310967 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.310981 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.310993 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:37Z","lastTransitionTime":"2025-12-05T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.414151 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.414223 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.414246 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.414274 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.414295 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:37Z","lastTransitionTime":"2025-12-05T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.516627 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.516693 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.516730 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.516765 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.516787 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:37Z","lastTransitionTime":"2025-12-05T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.619645 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.619681 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.619700 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.619716 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.619729 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:37Z","lastTransitionTime":"2025-12-05T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.722957 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.723026 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.723047 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.723076 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.723099 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:37Z","lastTransitionTime":"2025-12-05T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.826792 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.826871 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.826891 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.826923 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.826943 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:37Z","lastTransitionTime":"2025-12-05T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.929660 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.929733 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.929759 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.929967 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:37 crc kubenswrapper[4759]: I1205 00:24:37.929997 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:37Z","lastTransitionTime":"2025-12-05T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.032539 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.032564 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.032572 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.032584 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.032594 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:38Z","lastTransitionTime":"2025-12-05T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.135640 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.135700 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.135718 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.135737 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.135749 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:38Z","lastTransitionTime":"2025-12-05T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.155171 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.155193 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:38 crc kubenswrapper[4759]: E1205 00:24:38.155357 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:38 crc kubenswrapper[4759]: E1205 00:24:38.155483 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.237481 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.237527 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.237542 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.237559 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.237571 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:38Z","lastTransitionTime":"2025-12-05T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.340699 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.341424 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.341526 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.341625 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.341717 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:38Z","lastTransitionTime":"2025-12-05T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.444372 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.444423 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.444435 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.444452 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.444464 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:38Z","lastTransitionTime":"2025-12-05T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.547601 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.547666 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.547682 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.547706 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.547724 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:38Z","lastTransitionTime":"2025-12-05T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.651116 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.651189 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.651209 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.651230 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.651246 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:38Z","lastTransitionTime":"2025-12-05T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.755276 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.755530 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.755586 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.755622 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.755655 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:38Z","lastTransitionTime":"2025-12-05T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.858357 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.858401 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.858412 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.858429 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.858441 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:38Z","lastTransitionTime":"2025-12-05T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.961241 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.961286 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.961301 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.961446 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:38 crc kubenswrapper[4759]: I1205 00:24:38.961462 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:38Z","lastTransitionTime":"2025-12-05T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.063430 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.063469 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.063477 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.063491 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.063500 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.155566 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:39 crc kubenswrapper[4759]: E1205 00:24:39.155703 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.155572 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:39 crc kubenswrapper[4759]: E1205 00:24:39.155952 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.165836 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.165871 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.165879 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.165891 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.165899 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.269165 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.269219 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.269232 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.269253 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.269280 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.372987 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.373036 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.373054 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.373075 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.373089 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.475520 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.475696 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.475715 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.475735 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.475750 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.562596 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.562645 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.562657 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.562674 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.562685 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: E1205 00:24:39.576177 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.580197 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.580262 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.580285 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.580321 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.580379 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: E1205 00:24:39.593574 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.597903 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.597958 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
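[Editor's note: every failed patch in this section dies on the same TLS handshake: the serving certificate behind the "node.network-node-identity.openshift.io" webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, long before the node's clock time of 2025-12-05T00:24:39Z, and the log message is exactly the x509 NotAfter comparison. A minimal Go sketch for inspecting the certificate an endpoint presents; the address is taken from the log, everything else is illustrative:]

package main

// Editor's sketch, not cluster tooling: dial the webhook endpoint from the
// log and print the validity window of the certificate it presents.
// Verification is skipped on purpose so an already-expired certificate can
// still be inspected.

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Address taken from the log: Post "https://127.0.0.1:9743/node?timeout=10s"
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // inspect only; never do this for real traffic
	})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.UTC().Format(time.RFC3339))
	// This is the comparison the kubelet log reports as
	// "current time ... is after ...".
	fmt.Println("expired:  ", time.Now().UTC().After(cert.NotAfter))
}

[The journal continues:]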
event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.597979 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.598008 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.598031 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: E1205 00:24:39.613038 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.617086 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.617144 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
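[Editor's note: the escaped blob in each "failed to patch status" entry is a Kubernetes strategic-merge patch against the Node object's status; "$setElementOrder/conditions" is the patch directive that pins the order of the conditions list, and the rest is the four node conditions, the allocatable/capacity figures, the image inventory, and nodeInfo. A hedged sketch that decodes one embedded condition, with struct fields named exactly as they appear in the payload; the struct is illustrative, not a kubelet type:]

package main

// Editor's sketch: decode the Ready condition from the (unescaped) patch
// payload above.

import (
	"encoding/json"
	"fmt"
	"log"
)

type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
}

func main() {
	// The Ready condition exactly as it appears in the patch, unescaped.
	raw := `{"lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?","reason":"KubeletNotReady","status":"False","type":"Ready"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatalf("unmarshal: %v", err)
	}
	fmt.Printf("%s=%s reason=%s\n  %s\n", c.Type, c.Status, c.Reason, c.Message)
}

[The journal continues:]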
event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.617160 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.617185 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.617197 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: E1205 00:24:39.629888 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.633540 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.633567 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.633579 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.633598 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.633610 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: E1205 00:24:39.643976 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T00:24:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3291f5b6-2cb9-45e1-be99-f12561717489\\\",\\\"systemUUID\\\":\\\"26c1e85a-f767-4d62-bae3-5a75555c0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 05 00:24:39 crc kubenswrapper[4759]: E1205 00:24:39.644081 4759 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.645533 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
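[Editor's note: after five consecutive webhook rejections the kubelet stops retrying for this sync period, which is the "update node status exceeds retry count" entry above. In the upstream kubelet this bound is a small constant (nodeStatusUpdateRetry, 5 in the sources I have seen), which matches the five failed attempts logged here. An illustrative sketch of the pattern; the helper is a stand-in, not kubelet code:]

package main

// Editor's sketch of the bounded-retry pattern visible above. In the log
// every attempt fails identically, so after the fifth failure the kubelet
// logs "update node status exceeds retry count" and waits for the next sync.

import (
	"errors"
	"fmt"
)

// Named after the kubelet constant; 5 matches the five failed attempts above.
const nodeStatusUpdateRetry = 5

func tryPatchNodeStatus() error {
	// Stand-in for the real PATCH call that the webhook keeps rejecting.
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		err := tryPatchNodeStatus()
		if err == nil {
			return nil
		}
		fmt.Printf("attempt %d: %v (will retry)\n", i+1, err)
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("giving up:", err)
	}
}

[The journal continues:]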
event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.645579 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.645592 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.645611 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.645626 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.748604 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.748665 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.748677 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.748694 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.748706 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.851153 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.851199 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.851213 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.851233 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.851246 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.954600 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.954640 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.954653 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.954671 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:39 crc kubenswrapper[4759]: I1205 00:24:39.954689 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:39Z","lastTransitionTime":"2025-12-05T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.057803 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.057855 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.057870 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.057891 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.057905 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:40Z","lastTransitionTime":"2025-12-05T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.155334 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.155391 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:40 crc kubenswrapper[4759]: E1205 00:24:40.155629 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
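
Annotation: the "Unable to update node status" failures above trace back to the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serving a certificate that expired 2025-08-24T17:21:41Z. A minimal diagnostic sketch (hypothetical, not part of the log) that fetches that endpoint's serving certificate and prints its validity window; it assumes the third-party 'cryptography' package, v42+ for the *_utc accessors, and takes the host/port straight from the error message:

    # Hypothetical diagnostic: pull the serving certificate from the webhook
    # endpoint named in the error above and print its validity window.
    import datetime
    import socket
    import ssl

    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # endpoint copied from the kubelet error

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False       # verification is expected to fail,
    ctx.verify_mode = ssl.CERT_NONE  # so fetch the cert without verifying it

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    now = datetime.datetime.now(datetime.timezone.utc)
    print("notBefore:", cert.not_valid_before_utc)
    print("notAfter: ", cert.not_valid_after_utc)  # 2025-08-24T17:21:41Z per the log
    print("expired:  ", now > cert.not_valid_after_utc)
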
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:40 crc kubenswrapper[4759]: E1205 00:24:40.155787 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.160378 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.160433 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.160449 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.160472 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.160486 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:40Z","lastTransitionTime":"2025-12-05T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.262507 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.262542 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.262551 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.262564 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.262577 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:40Z","lastTransitionTime":"2025-12-05T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.365761 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.365810 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.365823 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.365840 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.365852 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:40Z","lastTransitionTime":"2025-12-05T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.467921 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.467964 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.467976 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.467990 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.468002 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:40Z","lastTransitionTime":"2025-12-05T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.571066 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.571110 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.571119 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.571135 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.571144 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:40Z","lastTransitionTime":"2025-12-05T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.670897 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:40 crc kubenswrapper[4759]: E1205 00:24:40.671149 4759 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:24:40 crc kubenswrapper[4759]: E1205 00:24:40.671279 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs podName:f6ca2f36-241c-41cb-9d1d-d6856e819953 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:44.671248646 +0000 UTC m=+163.886909636 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs") pod "network-metrics-daemon-ksxg9" (UID: "f6ca2f36-241c-41cb-9d1d-d6856e819953") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.673127 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.673173 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.673188 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.673206 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.673217 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:40Z","lastTransitionTime":"2025-12-05T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.775444 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.775493 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.775503 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.775521 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.775532 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:40Z","lastTransitionTime":"2025-12-05T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.878107 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.878174 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.878185 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.878202 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.878213 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:40Z","lastTransitionTime":"2025-12-05T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.980463 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.980525 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.980545 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.980574 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:40 crc kubenswrapper[4759]: I1205 00:24:40.980595 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:40Z","lastTransitionTime":"2025-12-05T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.083451 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.083527 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.083546 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.083569 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.083585 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:41Z","lastTransitionTime":"2025-12-05T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.155754 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.155845 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:41 crc kubenswrapper[4759]: E1205 00:24:41.156047 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:41 crc kubenswrapper[4759]: E1205 00:24:41.156169 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
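
Annotation: the MountVolume.SetUp failure for "metrics-certs" above is parked with durationBeforeRetry 1m4s before the next attempt. That 64s figure is consistent with a per-failure doubling backoff; a toy sketch of such a schedule follows, where the 500ms base and 128s cap are illustrative assumptions, not values read out of kubelet source:

    # Illustrative only: a per-failure doubling backoff that reproduces the
    # 1m4s durationBeforeRetry seen in the mount failure above.
    base_s, cap_s = 0.5, 128.0

    delay = base_s
    schedule = []
    while delay <= cap_s:
        schedule.append(delay)
        delay *= 2

    print(["%gs" % d for d in schedule])
    # ['0.5s', '1s', '2s', '4s', '8s', '16s', '32s', '64s', '128s']
    # 64s == the 1m4s logged above
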
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.188393 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.188468 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.188495 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.188530 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.188594 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:41Z","lastTransitionTime":"2025-12-05T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.205603 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spg8n" podStartSLOduration=79.205584129 podStartE2EDuration="1m19.205584129s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:41.185010221 +0000 UTC m=+100.400671251" watchObservedRunningTime="2025-12-05 00:24:41.205584129 +0000 UTC m=+100.421245079" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.234711 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=23.234690888 podStartE2EDuration="23.234690888s" podCreationTimestamp="2025-12-05 00:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:41.23307439 +0000 UTC m=+100.448735340" watchObservedRunningTime="2025-12-05 00:24:41.234690888 +0000 UTC m=+100.450351838" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.298822 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.298877 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.298886 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.298898 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.298907 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:41Z","lastTransitionTime":"2025-12-05T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.322443 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wdk4j" podStartSLOduration=80.322422516 podStartE2EDuration="1m20.322422516s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:41.295892268 +0000 UTC m=+100.511553228" watchObservedRunningTime="2025-12-05 00:24:41.322422516 +0000 UTC m=+100.538083466" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.352643 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=17.352619711 podStartE2EDuration="17.352619711s" podCreationTimestamp="2025-12-05 00:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:41.336911729 +0000 UTC m=+100.552572719" watchObservedRunningTime="2025-12-05 00:24:41.352619711 +0000 UTC m=+100.568280661" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.353101 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=77.353095232 podStartE2EDuration="1m17.353095232s" podCreationTimestamp="2025-12-05 00:23:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:41.352550629 +0000 UTC m=+100.568211579" watchObservedRunningTime="2025-12-05 00:24:41.353095232 +0000 UTC m=+100.568756182" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.365637 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-llpn6" podStartSLOduration=80.365620598 podStartE2EDuration="1m20.365620598s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:41.364760219 +0000 UTC m=+100.580421169" watchObservedRunningTime="2025-12-05 00:24:41.365620598 +0000 UTC m=+100.581281548" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.373739 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tnqtq" podStartSLOduration=80.37372155 podStartE2EDuration="1m20.37372155s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:41.373461554 +0000 UTC m=+100.589122494" watchObservedRunningTime="2025-12-05 00:24:41.37372155 +0000 UTC m=+100.589382500" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.401169 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.401215 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.401226 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 
00:24:41.401240 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.401249 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:41Z","lastTransitionTime":"2025-12-05T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.429231 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podStartSLOduration=79.429216265 podStartE2EDuration="1m19.429216265s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:41.414340162 +0000 UTC m=+100.630001112" watchObservedRunningTime="2025-12-05 00:24:41.429216265 +0000 UTC m=+100.644877215" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.444523 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=47.444506357 podStartE2EDuration="47.444506357s" podCreationTimestamp="2025-12-05 00:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:41.443898692 +0000 UTC m=+100.659559642" watchObservedRunningTime="2025-12-05 00:24:41.444506357 +0000 UTC m=+100.660167297" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.444705 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.444699422 podStartE2EDuration="1m19.444699422s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:41.429940942 +0000 UTC m=+100.645601892" watchObservedRunningTime="2025-12-05 00:24:41.444699422 +0000 UTC m=+100.660360372" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.483612 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7lmmf" podStartSLOduration=80.483583073 podStartE2EDuration="1m20.483583073s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:41.482529118 +0000 UTC m=+100.698190088" watchObservedRunningTime="2025-12-05 00:24:41.483583073 +0000 UTC m=+100.699244013" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.503796 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.503860 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.503872 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.503907 4759 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.503917 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:41Z","lastTransitionTime":"2025-12-05T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.606064 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.606120 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.606136 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.606157 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.606172 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:41Z","lastTransitionTime":"2025-12-05T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.709522 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.709559 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.709570 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.709588 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.709602 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:41Z","lastTransitionTime":"2025-12-05T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.812782 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.812853 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.812868 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.812890 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.812908 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:41Z","lastTransitionTime":"2025-12-05T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.915544 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.915584 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.915595 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.915611 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:41 crc kubenswrapper[4759]: I1205 00:24:41.915623 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:41Z","lastTransitionTime":"2025-12-05T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.018327 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.018376 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.018384 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.018399 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.018409 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:42Z","lastTransitionTime":"2025-12-05T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.120510 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.120542 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.120554 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.120569 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.120580 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:42Z","lastTransitionTime":"2025-12-05T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.155465 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.155475 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:42 crc kubenswrapper[4759]: E1205 00:24:42.155714 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:42 crc kubenswrapper[4759]: E1205 00:24:42.155756 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
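
Annotation: the pod_startup_latency_tracker entries above can be checked by hand, since podStartSLOduration is exactly watchObservedRunningTime minus podCreationTimestamp. A worked check of the ovnkube-control-plane-749d76644c-spg8n entry, with both timestamps copied from the log (nanoseconds truncated to microseconds, which Python's datetime requires):

    # Worked check: podStartSLOduration == watchObservedRunningTime -
    # podCreationTimestamp, using the values logged above.
    from datetime import datetime, timezone

    created  = datetime(2025, 12, 5, 0, 23, 22, tzinfo=timezone.utc)          # podCreationTimestamp
    observed = datetime(2025, 12, 5, 0, 24, 41, 205584, tzinfo=timezone.utc)  # watchObservedRunningTime

    print((observed - created).total_seconds())  # 79.205584, matching 79.205584129s
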
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.224678 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.224756 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.224768 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.224787 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.224823 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:42Z","lastTransitionTime":"2025-12-05T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.326548 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.326581 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.326589 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.326602 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.326611 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:42Z","lastTransitionTime":"2025-12-05T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.429466 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.429525 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.429537 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.429556 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.429568 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:42Z","lastTransitionTime":"2025-12-05T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.531828 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.531937 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.531949 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.531987 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.532000 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:42Z","lastTransitionTime":"2025-12-05T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.634868 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.634908 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.634920 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.634936 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.634947 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:42Z","lastTransitionTime":"2025-12-05T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.737549 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.737623 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.737638 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.737653 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.737662 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:42Z","lastTransitionTime":"2025-12-05T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.840696 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.840754 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.840766 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.840784 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.840795 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:42Z","lastTransitionTime":"2025-12-05T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.943780 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.943837 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.943848 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.943864 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:42 crc kubenswrapper[4759]: I1205 00:24:42.943875 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:42Z","lastTransitionTime":"2025-12-05T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.047480 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.047545 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.047568 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.047598 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.047618 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:43Z","lastTransitionTime":"2025-12-05T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.150560 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.150617 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.150635 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.150655 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.150667 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:43Z","lastTransitionTime":"2025-12-05T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.154983 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.154998 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:43 crc kubenswrapper[4759]: E1205 00:24:43.155382 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
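
Annotation: every NetworkReady=false message above points at the same condition: kubelet finds no CNI configuration in /etc/kubernetes/cni/net.d/ until the network plugin writes one. A quick sketch that inspects that directory; the path comes from the log, while treating *.conf/*.conflist/*.json as CNI configs is common convention and an assumption of this sketch:

    # Check the directory kubelet is complaining about for CNI config files.
    import pathlib

    cni_dir = pathlib.Path("/etc/kubernetes/cni/net.d")
    configs = (
        sorted(p for p in cni_dir.iterdir() if p.suffix in {".conf", ".conflist", ".json"})
        if cni_dir.is_dir()
        else []
    )
    print(f"{cni_dir}: {len(configs)} CNI config file(s) found")
    for path in configs:
        print("  ", path.name)
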
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:43 crc kubenswrapper[4759]: E1205 00:24:43.155495 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.155645 4759 scope.go:117] "RemoveContainer" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531" Dec 05 00:24:43 crc kubenswrapper[4759]: E1205 00:24:43.155793 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.252742 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.252783 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.252792 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.252809 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.252825 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:43Z","lastTransitionTime":"2025-12-05T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.355154 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.355198 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.355212 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.355228 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.355239 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:43Z","lastTransitionTime":"2025-12-05T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.458355 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.458396 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.458406 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.458422 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.458432 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:43Z","lastTransitionTime":"2025-12-05T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.560762 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.560808 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.560818 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.560835 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.560846 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:43Z","lastTransitionTime":"2025-12-05T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.663826 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.663866 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.663880 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.663899 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.663911 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:43Z","lastTransitionTime":"2025-12-05T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.766144 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.766185 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.766194 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.766210 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.766219 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:43Z","lastTransitionTime":"2025-12-05T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.869111 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.869178 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.869194 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.869218 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.869235 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:43Z","lastTransitionTime":"2025-12-05T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.972187 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.972236 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.972246 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.972264 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:43 crc kubenswrapper[4759]: I1205 00:24:43.972277 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:43Z","lastTransitionTime":"2025-12-05T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.074293 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.074439 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.074461 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.074490 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.074510 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:44Z","lastTransitionTime":"2025-12-05T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.155105 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:44 crc kubenswrapper[4759]: E1205 00:24:44.155265 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.155369 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:44 crc kubenswrapper[4759]: E1205 00:24:44.155836 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.177117 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.177192 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.177214 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.177246 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.177266 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:44Z","lastTransitionTime":"2025-12-05T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.279746 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.279787 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.279798 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.279814 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.279825 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:44Z","lastTransitionTime":"2025-12-05T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.381881 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.381919 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.381929 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.381955 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.381968 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:44Z","lastTransitionTime":"2025-12-05T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.484556 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.484597 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.484609 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.484624 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.484636 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:44Z","lastTransitionTime":"2025-12-05T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.589643 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.589691 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.589704 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.589722 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.589873 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:44Z","lastTransitionTime":"2025-12-05T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.693401 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.693480 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.693505 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.693534 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.693555 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:44Z","lastTransitionTime":"2025-12-05T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.795547 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.795608 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.795621 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.795639 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.795653 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:44Z","lastTransitionTime":"2025-12-05T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.898483 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.898517 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.898525 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.898538 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:44 crc kubenswrapper[4759]: I1205 00:24:44.898547 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:44Z","lastTransitionTime":"2025-12-05T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.000090 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.000124 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.000151 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.000176 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.000186 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:45Z","lastTransitionTime":"2025-12-05T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.102368 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.102448 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.102463 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.102481 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.102493 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:45Z","lastTransitionTime":"2025-12-05T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.154996 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.155020 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:45 crc kubenswrapper[4759]: E1205 00:24:45.155111 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:45 crc kubenswrapper[4759]: E1205 00:24:45.155208 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.204660 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.204716 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.204741 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.204762 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.204778 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:45Z","lastTransitionTime":"2025-12-05T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.308008 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.308049 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.308062 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.308078 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.308089 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:45Z","lastTransitionTime":"2025-12-05T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.411192 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.411249 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.411259 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.411279 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.411292 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:45Z","lastTransitionTime":"2025-12-05T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.513789 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.513856 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.513867 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.513888 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.513901 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:45Z","lastTransitionTime":"2025-12-05T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.617155 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.617193 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.617205 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.617222 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.617234 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:45Z","lastTransitionTime":"2025-12-05T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.720284 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.720351 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.720360 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.720376 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.720387 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:45Z","lastTransitionTime":"2025-12-05T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.823531 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.823607 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.823629 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.823658 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.823683 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:45Z","lastTransitionTime":"2025-12-05T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.926672 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.926749 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.926773 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.926802 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:45 crc kubenswrapper[4759]: I1205 00:24:45.926823 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:45Z","lastTransitionTime":"2025-12-05T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.029058 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.029099 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.029109 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.029123 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.029135 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:46Z","lastTransitionTime":"2025-12-05T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.130965 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.131014 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.131027 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.131043 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.131054 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:46Z","lastTransitionTime":"2025-12-05T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.155423 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:46 crc kubenswrapper[4759]: E1205 00:24:46.155556 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.155434 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:46 crc kubenswrapper[4759]: E1205 00:24:46.155718 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.233842 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.233883 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.233893 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.233907 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.233919 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:46Z","lastTransitionTime":"2025-12-05T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.337496 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.337557 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.337581 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.337612 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.337634 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:46Z","lastTransitionTime":"2025-12-05T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.440465 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.440526 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.440544 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.440573 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.440592 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:46Z","lastTransitionTime":"2025-12-05T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.544621 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.544773 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.544795 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.544859 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.544882 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:46Z","lastTransitionTime":"2025-12-05T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.648610 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.648661 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.648672 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.648688 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.648697 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:46Z","lastTransitionTime":"2025-12-05T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.751191 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.751252 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.751267 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.751288 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.751308 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:46Z","lastTransitionTime":"2025-12-05T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.853463 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.853535 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.853552 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.853576 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.853592 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:46Z","lastTransitionTime":"2025-12-05T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.955202 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.955252 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.955261 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.955273 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:46 crc kubenswrapper[4759]: I1205 00:24:46.955282 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:46Z","lastTransitionTime":"2025-12-05T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.058184 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.058241 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.058252 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.058270 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.058282 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:47Z","lastTransitionTime":"2025-12-05T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.154890 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:47 crc kubenswrapper[4759]: E1205 00:24:47.155027 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.155284 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:47 crc kubenswrapper[4759]: E1205 00:24:47.155594 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.160399 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.160573 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.160655 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.160734 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.160825 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:47Z","lastTransitionTime":"2025-12-05T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.264038 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.264076 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.264088 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.264105 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.264117 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:47Z","lastTransitionTime":"2025-12-05T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.367115 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.367489 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.367615 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.367752 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.367855 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:47Z","lastTransitionTime":"2025-12-05T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.470400 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.470436 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.470443 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.470457 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.470466 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:47Z","lastTransitionTime":"2025-12-05T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.573501 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.573551 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.573568 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.573589 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.573604 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:47Z","lastTransitionTime":"2025-12-05T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.675987 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.676276 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.676299 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.676451 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.676508 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:47Z","lastTransitionTime":"2025-12-05T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.779707 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.779796 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.779819 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.779869 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.779914 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:47Z","lastTransitionTime":"2025-12-05T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.882856 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.882917 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.882929 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.882946 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.882957 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:47Z","lastTransitionTime":"2025-12-05T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.985670 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.985717 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.985728 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.985742 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:47 crc kubenswrapper[4759]: I1205 00:24:47.985753 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:47Z","lastTransitionTime":"2025-12-05T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.012454 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.013703 4759 scope.go:117] "RemoveContainer" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"
Dec 05 00:24:48 crc kubenswrapper[4759]: E1205 00:24:48.013977 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mbhwx_openshift-ovn-kubernetes(45fa490b-1113-4ee6-9604-dc322ca11bd3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.088606 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.088671 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.088692 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.088721 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.088742 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:48Z","lastTransitionTime":"2025-12-05T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.155738 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.155738 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:24:48 crc kubenswrapper[4759]: E1205 00:24:48.155962 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953"
Dec 05 00:24:48 crc kubenswrapper[4759]: E1205 00:24:48.156071 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.191463 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.191527 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.191540 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.191557 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.191570 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:48Z","lastTransitionTime":"2025-12-05T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.294795 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.294885 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.294926 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.294951 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.294996 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:48Z","lastTransitionTime":"2025-12-05T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.398944 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.399239 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.399365 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.399470 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.399562 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:48Z","lastTransitionTime":"2025-12-05T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.502801 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.502851 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.502863 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.502878 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.502890 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:48Z","lastTransitionTime":"2025-12-05T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.605100 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.605147 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.605160 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.605178 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.605194 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:48Z","lastTransitionTime":"2025-12-05T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.708056 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.708084 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.708092 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.708106 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.708117 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:48Z","lastTransitionTime":"2025-12-05T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.810915 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.810960 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.810968 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.810985 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.810994 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:48Z","lastTransitionTime":"2025-12-05T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.913974 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.914283 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.914410 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.914493 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:48 crc kubenswrapper[4759]: I1205 00:24:48.914563 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:48Z","lastTransitionTime":"2025-12-05T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.016841 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.016881 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.016891 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.016908 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.016923 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:49Z","lastTransitionTime":"2025-12-05T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.120437 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.120504 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.120526 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.120554 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.120578 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:49Z","lastTransitionTime":"2025-12-05T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.154856 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:24:49 crc kubenswrapper[4759]: E1205 00:24:49.154992 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.154856 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 00:24:49 crc kubenswrapper[4759]: E1205 00:24:49.155217 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.223539 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.223606 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.223619 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.223641 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.223654 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:49Z","lastTransitionTime":"2025-12-05T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.325811 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.326122 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.326198 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.326286 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.326387 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:49Z","lastTransitionTime":"2025-12-05T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.429113 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.429159 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.429172 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.429190 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.429202 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:49Z","lastTransitionTime":"2025-12-05T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.531468 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.531933 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.532082 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.532234 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.532401 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:49Z","lastTransitionTime":"2025-12-05T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.635672 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.635716 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.635727 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.635746 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.635756 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:49Z","lastTransitionTime":"2025-12-05T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.738344 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.738432 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.738450 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.738472 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.738486 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:49Z","lastTransitionTime":"2025-12-05T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.841274 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.841331 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.841345 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.841364 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.841374 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:49Z","lastTransitionTime":"2025-12-05T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.903831 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.903882 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.903892 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.903912 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.903925 4759 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T00:24:49Z","lastTransitionTime":"2025-12-05T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.974840 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"]
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.975578 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.978189 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.980078 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.980241 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 05 00:24:49 crc kubenswrapper[4759]: I1205 00:24:49.980344 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.075366 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5fac495-3246-42e6-bf04-05a494ff18c0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.075419 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5fac495-3246-42e6-bf04-05a494ff18c0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.075452 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5fac495-3246-42e6-bf04-05a494ff18c0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.075474 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5fac495-3246-42e6-bf04-05a494ff18c0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.075564 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5fac495-3246-42e6-bf04-05a494ff18c0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.155327 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.155417 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:24:50 crc kubenswrapper[4759]: E1205 00:24:50.155475 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953"
Dec 05 00:24:50 crc kubenswrapper[4759]: E1205 00:24:50.155577 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.177205 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5fac495-3246-42e6-bf04-05a494ff18c0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.177287 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5fac495-3246-42e6-bf04-05a494ff18c0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.177358 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5fac495-3246-42e6-bf04-05a494ff18c0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.177393 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5fac495-3246-42e6-bf04-05a494ff18c0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.177440 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5fac495-3246-42e6-bf04-05a494ff18c0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.177666 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5fac495-3246-42e6-bf04-05a494ff18c0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.177787 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5fac495-3246-42e6-bf04-05a494ff18c0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.178283 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5fac495-3246-42e6-bf04-05a494ff18c0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.184248 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5fac495-3246-42e6-bf04-05a494ff18c0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.197370 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5fac495-3246-42e6-bf04-05a494ff18c0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f589t\" (UID: \"b5fac495-3246-42e6-bf04-05a494ff18c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.303056 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t"
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.814363 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t" event={"ID":"b5fac495-3246-42e6-bf04-05a494ff18c0","Type":"ContainerStarted","Data":"d03bd2532c8cffed6980d51fc3dfe4d20ab220712b7dae8704b538e3b1adf325"}
Dec 05 00:24:50 crc kubenswrapper[4759]: I1205 00:24:50.814435 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t" event={"ID":"b5fac495-3246-42e6-bf04-05a494ff18c0","Type":"ContainerStarted","Data":"cabb707613ea8ea04bd333e190ed0963c01e0a032d18102390e146537c6bb852"}
Dec 05 00:24:51 crc kubenswrapper[4759]: I1205 00:24:51.155848 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:24:51 crc kubenswrapper[4759]: I1205 00:24:51.155848 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 00:24:51 crc kubenswrapper[4759]: E1205 00:24:51.157160 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 00:24:51 crc kubenswrapper[4759]: E1205 00:24:51.157481 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 00:24:52 crc kubenswrapper[4759]: I1205 00:24:52.154746 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:24:52 crc kubenswrapper[4759]: I1205 00:24:52.154798 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9"
Dec 05 00:24:52 crc kubenswrapper[4759]: E1205 00:24:52.155030 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953"
Dec 05 00:24:52 crc kubenswrapper[4759]: E1205 00:24:52.155191 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 00:24:53 crc kubenswrapper[4759]: I1205 00:24:53.155732 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:24:53 crc kubenswrapper[4759]: I1205 00:24:53.155786 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 00:24:53 crc kubenswrapper[4759]: E1205 00:24:53.155898 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 00:24:53 crc kubenswrapper[4759]: E1205 00:24:53.157132 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 00:24:54 crc kubenswrapper[4759]: I1205 00:24:54.155343 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9"
Dec 05 00:24:54 crc kubenswrapper[4759]: I1205 00:24:54.155408 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:24:54 crc kubenswrapper[4759]: E1205 00:24:54.155463 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953"
Dec 05 00:24:54 crc kubenswrapper[4759]: E1205 00:24:54.155601 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 00:24:55 crc kubenswrapper[4759]: I1205 00:24:55.154872 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 00:24:55 crc kubenswrapper[4759]: I1205 00:24:55.154985 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:24:55 crc kubenswrapper[4759]: E1205 00:24:55.155095 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 00:24:55 crc kubenswrapper[4759]: E1205 00:24:55.155254 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 00:24:56 crc kubenswrapper[4759]: I1205 00:24:56.155562 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:24:56 crc kubenswrapper[4759]: I1205 00:24:56.155587 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9"
Dec 05 00:24:56 crc kubenswrapper[4759]: E1205 00:24:56.155772 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 00:24:56 crc kubenswrapper[4759]: E1205 00:24:56.155918 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953"
Dec 05 00:24:57 crc kubenswrapper[4759]: I1205 00:24:57.155108 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 00:24:57 crc kubenswrapper[4759]: I1205 00:24:57.155136 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:24:57 crc kubenswrapper[4759]: E1205 00:24:57.155349 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 00:24:57 crc kubenswrapper[4759]: E1205 00:24:57.155436 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:24:57 crc kubenswrapper[4759]: I1205 00:24:57.839490 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-llpn6_b33957c4-8ef0-4b57-8e3c-183091f3b022/kube-multus/1.log" Dec 05 00:24:57 crc kubenswrapper[4759]: I1205 00:24:57.839937 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-llpn6_b33957c4-8ef0-4b57-8e3c-183091f3b022/kube-multus/0.log" Dec 05 00:24:57 crc kubenswrapper[4759]: I1205 00:24:57.839987 4759 generic.go:334] "Generic (PLEG): container finished" podID="b33957c4-8ef0-4b57-8e3c-183091f3b022" containerID="4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3" exitCode=1 Dec 05 00:24:57 crc kubenswrapper[4759]: I1205 00:24:57.840024 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-llpn6" event={"ID":"b33957c4-8ef0-4b57-8e3c-183091f3b022","Type":"ContainerDied","Data":"4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3"} Dec 05 00:24:57 crc kubenswrapper[4759]: I1205 00:24:57.840063 4759 scope.go:117] "RemoveContainer" containerID="91e4c79b8435bcdb90acf6fc7cf3fd14ac52a4de8f0c4ad493059bbe15b4ff1d" Dec 05 00:24:57 crc kubenswrapper[4759]: I1205 00:24:57.840448 4759 scope.go:117] "RemoveContainer" containerID="4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3" Dec 05 00:24:57 crc kubenswrapper[4759]: E1205 00:24:57.840620 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-llpn6_openshift-multus(b33957c4-8ef0-4b57-8e3c-183091f3b022)\"" pod="openshift-multus/multus-llpn6" podUID="b33957c4-8ef0-4b57-8e3c-183091f3b022" Dec 05 00:24:57 crc kubenswrapper[4759]: I1205 00:24:57.858902 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f589t" podStartSLOduration=96.858863116 podStartE2EDuration="1m36.858863116s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:24:50.836534109 +0000 UTC m=+110.052195089" watchObservedRunningTime="2025-12-05 00:24:57.858863116 +0000 UTC m=+117.074524066" Dec 05 00:24:58 crc kubenswrapper[4759]: I1205 00:24:58.155532 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:24:58 crc kubenswrapper[4759]: I1205 00:24:58.155549 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:24:58 crc kubenswrapper[4759]: E1205 00:24:58.155831 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:24:58 crc kubenswrapper[4759]: E1205 00:24:58.156020 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:24:58 crc kubenswrapper[4759]: I1205 00:24:58.845654 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-llpn6_b33957c4-8ef0-4b57-8e3c-183091f3b022/kube-multus/1.log" Dec 05 00:24:59 crc kubenswrapper[4759]: I1205 00:24:59.155528 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:24:59 crc kubenswrapper[4759]: I1205 00:24:59.155612 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:24:59 crc kubenswrapper[4759]: E1205 00:24:59.155703 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:24:59 crc kubenswrapper[4759]: E1205 00:24:59.155776 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:25:00 crc kubenswrapper[4759]: I1205 00:25:00.154926 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:25:00 crc kubenswrapper[4759]: I1205 00:25:00.154948 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:00 crc kubenswrapper[4759]: E1205 00:25:00.155100 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:25:00 crc kubenswrapper[4759]: E1205 00:25:00.155201 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:25:01 crc kubenswrapper[4759]: E1205 00:25:01.005200 4759 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 05 00:25:01 crc kubenswrapper[4759]: I1205 00:25:01.154819 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:25:01 crc kubenswrapper[4759]: I1205 00:25:01.154927 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:25:01 crc kubenswrapper[4759]: E1205 00:25:01.157261 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:25:01 crc kubenswrapper[4759]: E1205 00:25:01.157530 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:25:01 crc kubenswrapper[4759]: E1205 00:25:01.269257 4759 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 00:25:02 crc kubenswrapper[4759]: I1205 00:25:02.154678 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:25:02 crc kubenswrapper[4759]: I1205 00:25:02.154678 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:02 crc kubenswrapper[4759]: I1205 00:25:02.155239 4759 scope.go:117] "RemoveContainer" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531" Dec 05 00:25:02 crc kubenswrapper[4759]: E1205 00:25:02.155476 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:25:02 crc kubenswrapper[4759]: E1205 00:25:02.155547 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:25:02 crc kubenswrapper[4759]: I1205 00:25:02.867711 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/3.log" Dec 05 00:25:02 crc kubenswrapper[4759]: I1205 00:25:02.877792 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerStarted","Data":"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"} Dec 05 00:25:02 crc kubenswrapper[4759]: I1205 00:25:02.878502 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:25:02 crc kubenswrapper[4759]: I1205 00:25:02.918716 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podStartSLOduration=100.918699068 podStartE2EDuration="1m40.918699068s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:02.917046019 +0000 UTC m=+122.132706989" watchObservedRunningTime="2025-12-05 00:25:02.918699068 +0000 UTC m=+122.134360028" Dec 05 00:25:03 crc kubenswrapper[4759]: I1205 00:25:03.428996 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:25:03 crc kubenswrapper[4759]: E1205 00:25:03.429103 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:25:03 crc kubenswrapper[4759]: I1205 00:25:03.429000 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:03 crc kubenswrapper[4759]: I1205 00:25:03.429000 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:25:03 crc kubenswrapper[4759]: E1205 00:25:03.429179 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:25:03 crc kubenswrapper[4759]: E1205 00:25:03.429228 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:25:04 crc kubenswrapper[4759]: I1205 00:25:04.040157 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ksxg9"] Dec 05 00:25:04 crc kubenswrapper[4759]: I1205 00:25:04.040274 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:04 crc kubenswrapper[4759]: E1205 00:25:04.040374 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:25:04 crc kubenswrapper[4759]: I1205 00:25:04.155556 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:25:04 crc kubenswrapper[4759]: E1205 00:25:04.155763 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:25:05 crc kubenswrapper[4759]: I1205 00:25:05.155353 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:25:05 crc kubenswrapper[4759]: I1205 00:25:05.155444 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:25:05 crc kubenswrapper[4759]: E1205 00:25:05.155477 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:25:05 crc kubenswrapper[4759]: E1205 00:25:05.155581 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:25:06 crc kubenswrapper[4759]: I1205 00:25:06.155687 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:06 crc kubenswrapper[4759]: I1205 00:25:06.155685 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:25:06 crc kubenswrapper[4759]: E1205 00:25:06.155887 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:25:06 crc kubenswrapper[4759]: E1205 00:25:06.155968 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:25:06 crc kubenswrapper[4759]: E1205 00:25:06.270354 4759 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 00:25:07 crc kubenswrapper[4759]: I1205 00:25:07.154726 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:25:07 crc kubenswrapper[4759]: E1205 00:25:07.154891 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:25:07 crc kubenswrapper[4759]: I1205 00:25:07.155024 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:25:07 crc kubenswrapper[4759]: E1205 00:25:07.155155 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:25:08 crc kubenswrapper[4759]: I1205 00:25:08.155415 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:08 crc kubenswrapper[4759]: I1205 00:25:08.155422 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:25:08 crc kubenswrapper[4759]: E1205 00:25:08.155600 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:25:08 crc kubenswrapper[4759]: E1205 00:25:08.155710 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:25:09 crc kubenswrapper[4759]: I1205 00:25:09.156495 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:25:09 crc kubenswrapper[4759]: I1205 00:25:09.156546 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:25:09 crc kubenswrapper[4759]: E1205 00:25:09.156656 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:25:09 crc kubenswrapper[4759]: E1205 00:25:09.156747 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:25:10 crc kubenswrapper[4759]: I1205 00:25:10.155543 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:10 crc kubenswrapper[4759]: I1205 00:25:10.155559 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:25:10 crc kubenswrapper[4759]: E1205 00:25:10.155742 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:25:10 crc kubenswrapper[4759]: E1205 00:25:10.155907 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:25:11 crc kubenswrapper[4759]: I1205 00:25:11.155650 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:25:11 crc kubenswrapper[4759]: I1205 00:25:11.156953 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:25:11 crc kubenswrapper[4759]: I1205 00:25:11.157052 4759 scope.go:117] "RemoveContainer" containerID="4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3" Dec 05 00:25:11 crc kubenswrapper[4759]: E1205 00:25:11.157090 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:25:11 crc kubenswrapper[4759]: E1205 00:25:11.157149 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:25:11 crc kubenswrapper[4759]: E1205 00:25:11.271059 4759 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 00:25:11 crc kubenswrapper[4759]: I1205 00:25:11.907519 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-llpn6_b33957c4-8ef0-4b57-8e3c-183091f3b022/kube-multus/1.log" Dec 05 00:25:11 crc kubenswrapper[4759]: I1205 00:25:11.907602 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-llpn6" event={"ID":"b33957c4-8ef0-4b57-8e3c-183091f3b022","Type":"ContainerStarted","Data":"5a1fb84e174a1d7c7e9b9de4af4a4362a0de80ff5696218ca1ee9d80afd8d3a1"} Dec 05 00:25:12 crc kubenswrapper[4759]: I1205 00:25:12.155010 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:12 crc kubenswrapper[4759]: E1205 00:25:12.155169 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:25:12 crc kubenswrapper[4759]: I1205 00:25:12.155268 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:25:12 crc kubenswrapper[4759]: E1205 00:25:12.155369 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:25:13 crc kubenswrapper[4759]: I1205 00:25:13.154701 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:25:13 crc kubenswrapper[4759]: I1205 00:25:13.154768 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:25:13 crc kubenswrapper[4759]: E1205 00:25:13.154869 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:25:13 crc kubenswrapper[4759]: E1205 00:25:13.155187 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:25:14 crc kubenswrapper[4759]: I1205 00:25:14.155075 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:14 crc kubenswrapper[4759]: I1205 00:25:14.155087 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:25:14 crc kubenswrapper[4759]: E1205 00:25:14.155335 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:25:14 crc kubenswrapper[4759]: E1205 00:25:14.155405 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:25:15 crc kubenswrapper[4759]: I1205 00:25:15.154916 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:25:15 crc kubenswrapper[4759]: I1205 00:25:15.154965 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:25:15 crc kubenswrapper[4759]: E1205 00:25:15.155135 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 00:25:15 crc kubenswrapper[4759]: E1205 00:25:15.155365 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 00:25:16 crc kubenswrapper[4759]: I1205 00:25:16.155037 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:16 crc kubenswrapper[4759]: I1205 00:25:16.155052 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:25:16 crc kubenswrapper[4759]: E1205 00:25:16.155190 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ksxg9" podUID="f6ca2f36-241c-41cb-9d1d-d6856e819953" Dec 05 00:25:16 crc kubenswrapper[4759]: E1205 00:25:16.155336 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 00:25:17 crc kubenswrapper[4759]: I1205 00:25:17.154968 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 00:25:17 crc kubenswrapper[4759]: I1205 00:25:17.155007 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:25:17 crc kubenswrapper[4759]: I1205 00:25:17.157716 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 00:25:17 crc kubenswrapper[4759]: I1205 00:25:17.157940 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 00:25:18 crc kubenswrapper[4759]: I1205 00:25:18.036435 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:25:18 crc kubenswrapper[4759]: I1205 00:25:18.155262 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 00:25:18 crc kubenswrapper[4759]: I1205 00:25:18.155265 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:18 crc kubenswrapper[4759]: I1205 00:25:18.157790 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 00:25:18 crc kubenswrapper[4759]: I1205 00:25:18.158450 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 00:25:18 crc kubenswrapper[4759]: I1205 00:25:18.158478 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 00:25:18 crc kubenswrapper[4759]: I1205 00:25:18.158608 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.464641 4759 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.519600 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lr9vd"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.520113 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.523400 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7hqjc"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.524146 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:20 crc kubenswrapper[4759]: W1205 00:25:20.525516 4759 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template": failed to list *v1.Secret: secrets "v4-0-config-system-ocp-branding-template" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 05 00:25:20 crc kubenswrapper[4759]: E1205 00:25:20.525659 4759 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-ocp-branding-template\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 00:25:20 crc kubenswrapper[4759]: W1205 00:25:20.525712 4759 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data": failed to list *v1.Secret: secrets "v4-0-config-user-idp-0-file-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 05 00:25:20 crc kubenswrapper[4759]: E1205 00:25:20.525817 4759 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-idp-0-file-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 00:25:20 crc kubenswrapper[4759]: W1205 00:25:20.525854 4759 reflector.go:561] object-"openshift-authentication"/"audit": failed to list *v1.ConfigMap: configmaps "audit" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 05 00:25:20 crc kubenswrapper[4759]: E1205 00:25:20.525953 4759 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"audit\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 00:25:20 crc kubenswrapper[4759]: W1205 00:25:20.526044 4759 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-session": failed to list *v1.Secret: secrets "v4-0-config-system-session" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 05 00:25:20 crc kubenswrapper[4759]: E1205 00:25:20.526113 4759 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-session\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-session\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the 
namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 00:25:20 crc kubenswrapper[4759]: W1205 00:25:20.526064 4759 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-provider-selection": failed to list *v1.Secret: secrets "v4-0-config-user-template-provider-selection" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 05 00:25:20 crc kubenswrapper[4759]: E1205 00:25:20.526182 4759 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-provider-selection\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 00:25:20 crc kubenswrapper[4759]: W1205 00:25:20.525788 4759 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-cliconfig": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-cliconfig" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 05 00:25:20 crc kubenswrapper[4759]: E1205 00:25:20.526250 4759 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-cliconfig\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 00:25:20 crc kubenswrapper[4759]: W1205 00:25:20.526481 4759 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-error": failed to list *v1.Secret: secrets "v4-0-config-user-template-error" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Dec 05 00:25:20 crc kubenswrapper[4759]: E1205 00:25:20.526559 4759 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-error\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.528110 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.528957 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.529975 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 
00:25:20.530365 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 00:25:20 crc kubenswrapper[4759]: W1205 00:25:20.530750 4759 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 05 00:25:20 crc kubenswrapper[4759]: E1205 00:25:20.530817 4759 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.530964 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 00:25:20 crc kubenswrapper[4759]: W1205 00:25:20.531539 4759 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.531638 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.531541 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 00:25:20 crc kubenswrapper[4759]: E1205 00:25:20.531648 4759 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.532069 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.540018 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.544928 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.574508 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.575367 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.577218 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-g8mxq"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.577900 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.577934 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.578636 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.580239 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ppbrk"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.580773 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.581204 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.581790 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.583389 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ntfxg"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.584051 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.584052 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dtbm8"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.584935 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.585498 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.585601 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.585770 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vtldn"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.586162 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vtldn" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.593090 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.593269 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.593505 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.593679 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.593796 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.593859 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.594090 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.594190 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.594422 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.594680 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.594815 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gng7x"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.595156 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.596531 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.596669 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.596753 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.597082 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.597131 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.597287 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.597560 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.597572 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.597737 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.597884 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.597948 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.597974 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.598746 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.599058 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.599187 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.599381 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.599932 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.600458 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.603887 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.604707 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.604831 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.604940 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.605106 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.605927 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.606115 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.606250 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.606402 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.606523 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.606642 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.606755 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.606914 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.606966 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.606985 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.607067 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.607187 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.607298 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.607340 4759 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.607478 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.607327 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.607487 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.607727 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.611442 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h2xln"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.612504 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-wf4wz"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.612785 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wf4wz" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.613257 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.613747 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.614116 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.626903 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.647729 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.648973 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.649057 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.649243 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.649661 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.650184 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29414880-vdbj2"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.650706 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29414880-vdbj2" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.651777 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.653375 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.653491 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.653580 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.653584 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.653675 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.653791 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.653898 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.654044 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.654124 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.654158 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.654278 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.654407 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.654499 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655097 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655235 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655399 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655413 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655508 4759 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-image-registry"/"image-registry-tls" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655626 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655673 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655702 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655776 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655838 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655848 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.655907 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.656097 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.656224 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.656394 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.657747 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.663064 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.663777 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79"] Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.664031 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.663186 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.664359 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b" Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.664535 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.664629 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.666166 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.666777 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.667338 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.669849 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-mhbwk"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.670558 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vtnn"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.670901 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.673576 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.676059 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-mhbwk"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.676355 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.677898 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.678736 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.680179 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.680478 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.681836 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.682095 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7hqjc"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.686458 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8p7g7"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.687153 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.687241 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.687816 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.689669 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.690407 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.690715 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.691412 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.694197 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.694980 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.697239 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.703871 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.706363 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.706677 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.711114 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.723031 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.724132 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.724724 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-drwnh"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.725252 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.726533 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.728429 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.728542 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9c5t7"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.729256 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.729928 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wwwmq"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.730880 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.731398 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.735186 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lr9vd"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.735333 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.738111 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.738193 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ppbrk"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.738261 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9pz94"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.738572 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.738969 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9pz94"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.742390 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-f9knl"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.748704 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4-serving-cert\") pod \"openshift-config-operator-7777fb866f-qlgnw\" (UID: \"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.748754 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpxmt\" (UniqueName: \"kubernetes.io/projected/08f680e4-29bc-4ffc-962b-1a3151e5e41f-kube-api-access-dpxmt\") pod \"cluster-samples-operator-665b6dd947-kcxjh\" (UID: \"08f680e4-29bc-4ffc-962b-1a3151e5e41f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.748774 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qlgnw\" (UID: \"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.748797 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fflp6\" (UniqueName: \"kubernetes.io/projected/02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4-kube-api-access-fflp6\") pod \"openshift-config-operator-7777fb866f-qlgnw\" (UID: \"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.748829 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc8hb\" (UniqueName: \"kubernetes.io/projected/d5488f06-06a1-48b6-9103-abff66383776-kube-api-access-xc8hb\") pod \"openshift-controller-manager-operator-756b6f6bc6-rzm5t\" (UID: \"d5488f06-06a1-48b6-9103-abff66383776\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.748859 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f2b40743-a414-4dd8-9613-0bc14b937e3d-images\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.748879 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2b40743-a414-4dd8-9613-0bc14b937e3d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.748903 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47zt9\" (UniqueName: \"kubernetes.io/projected/f2b40743-a414-4dd8-9613-0bc14b937e3d-kube-api-access-47zt9\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.748928 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5488f06-06a1-48b6-9103-abff66383776-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rzm5t\" (UID: \"d5488f06-06a1-48b6-9103-abff66383776\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.748954 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/08f680e4-29bc-4ffc-962b-1a3151e5e41f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kcxjh\" (UID: \"08f680e4-29bc-4ffc-962b-1a3151e5e41f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.748983 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5488f06-06a1-48b6-9103-abff66383776-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rzm5t\" (UID: \"d5488f06-06a1-48b6-9103-abff66383776\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.749002 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b40743-a414-4dd8-9613-0bc14b937e3d-config\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.751017 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g8mxq"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.751384 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-f9knl"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.754156 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.763410 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.764548 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.764670 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.767746 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9r6hq"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.768828 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9r6hq"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.777432 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6hsqf"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.778857 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dtbm8"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.778996 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.779194 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6hsqf"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.784691 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.785870 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.786975 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.793144 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wf4wz"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.796367 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.797829 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h2xln"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.799008 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.800085 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vtldn"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.801688 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gng7x"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.802698 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.803088 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.804197 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ntfxg"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.805955 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.807133 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.809767 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.811379 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.812636 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29414880-vdbj2"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.813901 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9c5t7"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.815154 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.816424 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.817737 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vtnn"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.819026 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-drwnh"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.820689 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.822217 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.823365 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.824348 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8p7g7"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.829182 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wwwmq"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.833789 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6hsqf"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.836356 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.840672 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9pz94"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.842279 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.843691 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f9knl"]
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850006 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpxmt\" (UniqueName: \"kubernetes.io/projected/08f680e4-29bc-4ffc-962b-1a3151e5e41f-kube-api-access-dpxmt\") pod \"cluster-samples-operator-665b6dd947-kcxjh\" (UID: \"08f680e4-29bc-4ffc-962b-1a3151e5e41f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850050 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qlgnw\" (UID: \"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850077 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fflp6\" (UniqueName: \"kubernetes.io/projected/02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4-kube-api-access-fflp6\") pod \"openshift-config-operator-7777fb866f-qlgnw\" (UID: \"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850113 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc8hb\" (UniqueName: \"kubernetes.io/projected/d5488f06-06a1-48b6-9103-abff66383776-kube-api-access-xc8hb\") pod \"openshift-controller-manager-operator-756b6f6bc6-rzm5t\" (UID: \"d5488f06-06a1-48b6-9103-abff66383776\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850148 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f2b40743-a414-4dd8-9613-0bc14b937e3d-images\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850170 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2b40743-a414-4dd8-9613-0bc14b937e3d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850195 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47zt9\" (UniqueName: \"kubernetes.io/projected/f2b40743-a414-4dd8-9613-0bc14b937e3d-kube-api-access-47zt9\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850223 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5488f06-06a1-48b6-9103-abff66383776-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rzm5t\" (UID: \"d5488f06-06a1-48b6-9103-abff66383776\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850252 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/08f680e4-29bc-4ffc-962b-1a3151e5e41f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kcxjh\" (UID: \"08f680e4-29bc-4ffc-962b-1a3151e5e41f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850288 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5488f06-06a1-48b6-9103-abff66383776-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rzm5t\" (UID: \"d5488f06-06a1-48b6-9103-abff66383776\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850333 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b40743-a414-4dd8-9613-0bc14b937e3d-config\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.850368 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4-serving-cert\") pod \"openshift-config-operator-7777fb866f-qlgnw\" (UID: \"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.851112 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f2b40743-a414-4dd8-9613-0bc14b937e3d-images\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.851484 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qlgnw\" (UID: \"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.852180 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5488f06-06a1-48b6-9103-abff66383776-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rzm5t\" (UID: \"d5488f06-06a1-48b6-9103-abff66383776\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.855912 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5488f06-06a1-48b6-9103-abff66383776-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rzm5t\" (UID: \"d5488f06-06a1-48b6-9103-abff66383776\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.856099 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4-serving-cert\") pod \"openshift-config-operator-7777fb866f-qlgnw\" (UID: \"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.856706 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b40743-a414-4dd8-9613-0bc14b937e3d-config\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.858821 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2b40743-a414-4dd8-9613-0bc14b937e3d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.861162 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/08f680e4-29bc-4ffc-962b-1a3151e5e41f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kcxjh\" (UID: \"08f680e4-29bc-4ffc-962b-1a3151e5e41f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.862089 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.885921 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.902179 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.921793 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.944925 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 05 00:25:20 crc kubenswrapper[4759]: I1205 00:25:20.962717 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.002197 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.022140 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.041971 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052187 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052221 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-certificates\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052239 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052261 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-tls\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052277 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6840df22-cb55-402a-9138-567bcdae100c-config\") pod \"kube-controller-manager-operator-78b949d7b-vm2fd\" (UID: \"6840df22-cb55-402a-9138-567bcdae100c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052294 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052330 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415154e7-28be-49ac-954d-88342198e56e-config\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052354 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b55v9\" (UniqueName: \"kubernetes.io/projected/576c976f-56ce-4409-8654-e9a6264a71d1-kube-api-access-b55v9\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052374 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56536c3b-13f4-4ead-a2e4-2c30a87ef64c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6hzds\" (UID: \"56536c3b-13f4-4ead-a2e4-2c30a87ef64c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052390 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052404 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/415154e7-28be-49ac-954d-88342198e56e-service-ca-bundle\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052447 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40f30299-1808-43a6-83db-44e27fa0b18e-metrics-tls\") pod \"dns-operator-744455d44c-ppbrk\" (UID: \"40f30299-1808-43a6-83db-44e27fa0b18e\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052466 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-console-config\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052495 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a888f59-c21c-4786-8a70-cdabfba7a293-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5b4pd\" (UID: \"9a888f59-c21c-4786-8a70-cdabfba7a293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052517 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052531 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-trusted-ca\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052562 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/398957bf-f56a-4d8c-8e79-73bc19356c88-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052577 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-audit-policies\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052599 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052613 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/adb9d332-b13b-456d-9d04-32124d387a36-encryption-config\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052627 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-policies\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052645 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-config\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052661 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j8bl\" (UniqueName: \"kubernetes.io/projected/415154e7-28be-49ac-954d-88342198e56e-kube-api-access-4j8bl\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052676 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56536c3b-13f4-4ead-a2e4-2c30a87ef64c-config\") pod \"kube-apiserver-operator-766d6c64bb-6hzds\" (UID: \"56536c3b-13f4-4ead-a2e4-2c30a87ef64c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052692 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv74b\" (UniqueName: \"kubernetes.io/projected/398957bf-f56a-4d8c-8e79-73bc19356c88-kube-api-access-hv74b\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052707 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052721 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8g9\" (UniqueName: \"kubernetes.io/projected/9a888f59-c21c-4786-8a70-cdabfba7a293-kube-api-access-px8g9\") pod \"openshift-apiserver-operator-796bbdcf4f-5b4pd\" (UID: \"9a888f59-c21c-4786-8a70-cdabfba7a293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052736 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/adb9d332-b13b-456d-9d04-32124d387a36-etcd-client\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052751 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlgjg\" (UniqueName: \"kubernetes.io/projected/adb9d332-b13b-456d-9d04-32124d387a36-kube-api-access-zlgjg\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052765 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-bound-sa-token\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052780 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/398957bf-f56a-4d8c-8e79-73bc19356c88-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052795 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f27f029e-62d3-497f-8f69-a95229ebe945-config\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052810 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56536c3b-13f4-4ead-a2e4-2c30a87ef64c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6hzds\" (UID: \"56536c3b-13f4-4ead-a2e4-2c30a87ef64c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052825 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-audit-dir\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052839 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052855 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052870 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-serving-cert\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052891 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-audit\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052907 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052923 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adb9d332-b13b-456d-9d04-32124d387a36-serving-cert\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052939 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24rcs\" (UniqueName: \"kubernetes.io/projected/f27f029e-62d3-497f-8f69-a95229ebe945-kube-api-access-24rcs\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052953 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052968 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-trusted-ca-bundle\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052984 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.052999 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f27f029e-62d3-497f-8f69-a95229ebe945-trusted-ca\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053014 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6840df22-cb55-402a-9138-567bcdae100c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vm2fd\" (UID: \"6840df22-cb55-402a-9138-567bcdae100c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053028 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053044 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-image-import-ca\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053057 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/415154e7-28be-49ac-954d-88342198e56e-serving-cert\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053073 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntbx6\" (UniqueName: \"kubernetes.io/projected/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-kube-api-access-ntbx6\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053089 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-serving-cert\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053105 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-encryption-config\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053120 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-oauth-config\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053136 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-service-ca\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053151 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-client-ca\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053165 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a888f59-c21c-4786-8a70-cdabfba7a293-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5b4pd\" (UID: \"9a888f59-c21c-4786-8a70-cdabfba7a293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053179 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053194 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-config\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053208 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/415154e7-28be-49ac-954d-88342198e56e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053224 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbswj\" (UniqueName: \"kubernetes.io/projected/84b8f271-fcc3-4014-8a36-3e7019bef7c5-kube-api-access-kbswj\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053240 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6171c662-5317-43a1-bc72-e0d9fbe54466-installation-pull-secrets\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053254 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6840df22-cb55-402a-9138-567bcdae100c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vm2fd\" (UID: \"6840df22-cb55-402a-9138-567bcdae100c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053268 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-etcd-serving-ca\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053285 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/398957bf-f56a-4d8c-8e79-73bc19356c88-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053301 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76k4t\" (UniqueName: \"kubernetes.io/projected/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-kube-api-access-76k4t\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053349 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-serving-cert\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053368 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053387 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfhpm\" (UniqueName: \"kubernetes.io/projected/f99bc61a-b820-4ebd-8ed0-d18cba6c017a-kube-api-access-dfhpm\") pod \"downloads-7954f5f757-wf4wz\" (UID: \"f99bc61a-b820-4ebd-8ed0-d18cba6c017a\") " pod="openshift-console/downloads-7954f5f757-wf4wz"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053403 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/adb9d332-b13b-456d-9d04-32124d387a36-audit-dir\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053421 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krg75\" (UniqueName: \"kubernetes.io/projected/40f30299-1808-43a6-83db-44e27fa0b18e-kube-api-access-krg75\") pod \"dns-operator-744455d44c-ppbrk\" (UID: \"40f30299-1808-43a6-83db-44e27fa0b18e\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053435 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-oauth-serving-cert\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053452 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6171c662-5317-43a1-bc72-e0d9fbe54466-ca-trust-extracted\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053466 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqtzd\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-kube-api-access-wqtzd\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053479 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-etcd-client\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053492 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-dir\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053505 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053520 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/adb9d332-b13b-456d-9d04-32124d387a36-node-pullsecrets\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.053543 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f27f029e-62d3-497f-8f69-a95229ebe945-serving-cert\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn"
Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.053803 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:21.553793182 +0000 UTC m=+140.769454132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.061619 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.082661 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.102434 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.122632 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.141989 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.154876 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:21 crc kubenswrapper[4759]: I1205
00:25:21.155114 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78n8t\" (UniqueName: \"kubernetes.io/projected/ff68e70c-0561-4324-8cd4-1c8897cff45b-kube-api-access-78n8t\") pod \"machine-config-controller-84d6567774-pc8pz\" (UID: \"ff68e70c-0561-4324-8cd4-1c8897cff45b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155155 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b9d750eb-071c-4580-b626-26b375e56870-profile-collector-cert\") pod \"olm-operator-6b444d44fb-s8pqf\" (UID: \"b9d750eb-071c-4580-b626-26b375e56870\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.155208 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:21.655177492 +0000 UTC m=+140.870838442 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155277 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-trusted-ca\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155344 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155371 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-csi-data-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155412 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5p86\" (UniqueName: \"kubernetes.io/projected/2183c13d-b67c-403a-8723-8c62e3ad57f3-kube-api-access-t5p86\") pod \"package-server-manager-789f6589d5-66mwb\" (UID: \"2183c13d-b67c-403a-8723-8c62e3ad57f3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155430 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1ec533-684e-4f58-8861-adc357bf448e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dpv8g\" (UID: \"fc1ec533-684e-4f58-8861-adc357bf448e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155584 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/398957bf-f56a-4d8c-8e79-73bc19356c88-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155626 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/adb9d332-b13b-456d-9d04-32124d387a36-encryption-config\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155654 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56536c3b-13f4-4ead-a2e4-2c30a87ef64c-config\") pod \"kube-apiserver-operator-766d6c64bb-6hzds\" (UID: \"56536c3b-13f4-4ead-a2e4-2c30a87ef64c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155814 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-config\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155842 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j8bl\" (UniqueName: \"kubernetes.io/projected/415154e7-28be-49ac-954d-88342198e56e-kube-api-access-4j8bl\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.155978 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h7jx\" (UniqueName: \"kubernetes.io/projected/fc1ec533-684e-4f58-8861-adc357bf448e-kube-api-access-9h7jx\") pod \"kube-storage-version-migrator-operator-b67b599dd-dpv8g\" (UID: \"fc1ec533-684e-4f58-8861-adc357bf448e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.156003 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s6sm\" (UniqueName: \"kubernetes.io/projected/a6019120-bf7b-47df-9e54-c7761066eb48-kube-api-access-8s6sm\") pod \"marketplace-operator-79b997595-8vtnn\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.156114 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbbc5689-2333-4227-984e-57e82e237746-config\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.156140 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.156216 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px8g9\" (UniqueName: \"kubernetes.io/projected/9a888f59-c21c-4786-8a70-cdabfba7a293-kube-api-access-px8g9\") pod \"openshift-apiserver-operator-796bbdcf4f-5b4pd\" (UID: \"9a888f59-c21c-4786-8a70-cdabfba7a293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.156248 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlgjg\" (UniqueName: \"kubernetes.io/projected/adb9d332-b13b-456d-9d04-32124d387a36-kube-api-access-zlgjg\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.156281 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcmss\" (UniqueName: \"kubernetes.io/projected/dc43ed3f-937d-4de9-9e4a-301788d5d19d-kube-api-access-tcmss\") pod \"machine-config-server-9r6hq\" (UID: \"dc43ed3f-937d-4de9-9e4a-301788d5d19d\") " pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.156580 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f27f029e-62d3-497f-8f69-a95229ebe945-config\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.157139 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-audit-dir\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.157300 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c369f5e-c51c-46f0-a184-7e9c627451f4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6fgg\" (UID: \"3c369f5e-c51c-46f0-a184-7e9c627451f4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.157450 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-serving-cert\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.157533 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/03b74b39-100a-4fe4-8bae-0f2088728e24-tmpfs\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.157225 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-audit-dir\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.157363 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-trusted-ca\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.157398 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-config\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.157470 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f27f029e-62d3-497f-8f69-a95229ebe945-config\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.157650 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.157108 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.156783 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/398957bf-f56a-4d8c-8e79-73bc19356c88-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.158046 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-audit\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.158148 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-plugins-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.158565 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctrp6\" (UniqueName: \"kubernetes.io/projected/bbbc5689-2333-4227-984e-57e82e237746-kube-api-access-ctrp6\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.158680 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff68e70c-0561-4324-8cd4-1c8897cff45b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pc8pz\" (UID: \"ff68e70c-0561-4324-8cd4-1c8897cff45b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.158225 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-audit\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.158523 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56536c3b-13f4-4ead-a2e4-2c30a87ef64c-config\") pod \"kube-apiserver-operator-766d6c64bb-6hzds\" (UID: \"56536c3b-13f4-4ead-a2e4-2c30a87ef64c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.158833 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-trusted-ca-bundle\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.158969 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/392f2eff-cfc4-431e-83ad-a8c9328aa8a8-profile-collector-cert\") pod \"catalog-operator-68c6474976-fpl7m\" (UID: \"392f2eff-cfc4-431e-83ad-a8c9328aa8a8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.158912 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/adb9d332-b13b-456d-9d04-32124d387a36-encryption-config\") pod \"apiserver-76f77b778f-ntfxg\" (UID: 
\"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.159053 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/45df7d8a-597a-42b0-8116-37bf7d3e7627-serviceca\") pod \"image-pruner-29414880-vdbj2\" (UID: \"45df7d8a-597a-42b0-8116-37bf7d3e7627\") " pod="openshift-image-registry/image-pruner-29414880-vdbj2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.159191 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.159265 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/415154e7-28be-49ac-954d-88342198e56e-serving-cert\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.159365 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6zm7\" (UniqueName: \"kubernetes.io/projected/2031118b-50a5-4d09-afab-37e601a1631e-kube-api-access-w6zm7\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.159779 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-trusted-ca-bundle\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.159632 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.159930 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c369f5e-c51c-46f0-a184-7e9c627451f4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6fgg\" (UID: \"3c369f5e-c51c-46f0-a184-7e9c627451f4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.160041 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwtbr\" (UniqueName: \"kubernetes.io/projected/392f2eff-cfc4-431e-83ad-a8c9328aa8a8-kube-api-access-nwtbr\") pod \"catalog-operator-68c6474976-fpl7m\" (UID: \"392f2eff-cfc4-431e-83ad-a8c9328aa8a8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 
00:25:21.160140 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2183c13d-b67c-403a-8723-8c62e3ad57f3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-66mwb\" (UID: \"2183c13d-b67c-403a-8723-8c62e3ad57f3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.160247 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czwvd\" (UniqueName: \"kubernetes.io/projected/cb7e479b-65fb-45dc-bf4b-cda530317c77-kube-api-access-czwvd\") pod \"multus-admission-controller-857f4d67dd-wwwmq\" (UID: \"cb7e479b-65fb-45dc-bf4b-cda530317c77\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.160370 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-encryption-config\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.160475 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-oauth-config\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.160555 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-service-ca\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.160632 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a9ce926-4d8b-4608-9c75-9ddbc87a2464-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2zx9b\" (UID: \"1a9ce926-4d8b-4608-9c75-9ddbc87a2464\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.160708 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-client-ca\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.160807 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a888f59-c21c-4786-8a70-cdabfba7a293-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5b4pd\" (UID: \"9a888f59-c21c-4786-8a70-cdabfba7a293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.160885 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb9fg\" (UniqueName: \"kubernetes.io/projected/03b74b39-100a-4fe4-8bae-0f2088728e24-kube-api-access-nb9fg\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.160959 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.161063 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/415154e7-28be-49ac-954d-88342198e56e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.161145 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-default-certificate\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.161217 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-service-ca-bundle\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.161327 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-socket-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.161473 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6840df22-cb55-402a-9138-567bcdae100c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vm2fd\" (UID: \"6840df22-cb55-402a-9138-567bcdae100c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.162242 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.162247 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-etcd-serving-ca\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.162455 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03b74b39-100a-4fe4-8bae-0f2088728e24-webhook-cert\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.162452 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-service-ca\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.162490 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a888f59-c21c-4786-8a70-cdabfba7a293-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5b4pd\" (UID: \"9a888f59-c21c-4786-8a70-cdabfba7a293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.162187 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/415154e7-28be-49ac-954d-88342198e56e-serving-cert\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.163025 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76k4t\" (UniqueName: \"kubernetes.io/projected/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-kube-api-access-76k4t\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.163057 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc1ec533-684e-4f58-8861-adc357bf448e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dpv8g\" (UID: \"fc1ec533-684e-4f58-8861-adc357bf448e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.163520 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-etcd-serving-ca\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.163680 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-serving-cert\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.163769 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/415154e7-28be-49ac-954d-88342198e56e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.163852 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.164278 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-encryption-config\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.164966 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-client-ca\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.165995 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-oauth-config\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168362 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-registration-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168406 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168429 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-certs\") pod \"machine-config-server-9r6hq\" (UID: \"dc43ed3f-937d-4de9-9e4a-301788d5d19d\") " pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168445 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03b74b39-100a-4fe4-8bae-0f2088728e24-apiservice-cert\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168466 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-images\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168483 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f721b883-a5eb-4ecb-9565-e03cdb24c368-config\") pod \"service-ca-operator-777779d784-drwnh\" (UID: \"f721b883-a5eb-4ecb-9565-e03cdb24c368\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168527 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfhpm\" (UniqueName: \"kubernetes.io/projected/f99bc61a-b820-4ebd-8ed0-d18cba6c017a-kube-api-access-dfhpm\") pod \"downloads-7954f5f757-wf4wz\" (UID: \"f99bc61a-b820-4ebd-8ed0-d18cba6c017a\") " pod="openshift-console/downloads-7954f5f757-wf4wz" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168579 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krg75\" (UniqueName: \"kubernetes.io/projected/40f30299-1808-43a6-83db-44e27fa0b18e-kube-api-access-krg75\") pod \"dns-operator-744455d44c-ppbrk\" (UID: \"40f30299-1808-43a6-83db-44e27fa0b18e\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168598 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-oauth-serving-cert\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168644 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4zf6\" (UniqueName: \"kubernetes.io/projected/b9d750eb-071c-4580-b626-26b375e56870-kube-api-access-v4zf6\") pod \"olm-operator-6b444d44fb-s8pqf\" (UID: \"b9d750eb-071c-4580-b626-26b375e56870\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168663 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0206012f-d861-4f45-9dfd-3923117fea31-cert\") pod \"ingress-canary-9pz94\" (UID: \"0206012f-d861-4f45-9dfd-3923117fea31\") " pod="openshift-ingress-canary/ingress-canary-9pz94" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.168679 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f721b883-a5eb-4ecb-9565-e03cdb24c368-serving-cert\") pod \"service-ca-operator-777779d784-drwnh\" (UID: \"f721b883-a5eb-4ecb-9565-e03cdb24c368\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.169277 4759 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-oauth-serving-cert\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.169639 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgk8q\" (UniqueName: \"kubernetes.io/projected/86a28668-36ac-4656-b1bb-1fd68c51e6de-kube-api-access-tgk8q\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.169737 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f27f029e-62d3-497f-8f69-a95229ebe945-serving-cert\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.169840 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-etcd-client\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.169932 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-dir\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.169966 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.170070 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/adb9d332-b13b-456d-9d04-32124d387a36-node-pullsecrets\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.170141 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5bgw\" (UniqueName: \"kubernetes.io/projected/0d64b1f1-f632-4956-a9d8-83703ef96ca1-kube-api-access-l5bgw\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.170176 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22760a3a-a198-4a84-8de8-20a745b3cb30-signing-cabundle\") pod \"service-ca-9c57cc56f-9c5t7\" (UID: 
\"22760a3a-a198-4a84-8de8-20a745b3cb30\") " pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.170221 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-dir\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.170331 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/adb9d332-b13b-456d-9d04-32124d387a36-node-pullsecrets\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.170549 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-certificates\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.170710 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-tls\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.170853 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415154e7-28be-49ac-954d-88342198e56e-config\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.170985 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b55v9\" (UniqueName: \"kubernetes.io/projected/576c976f-56ce-4409-8654-e9a6264a71d1-kube-api-access-b55v9\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.171093 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dth5r\" (UniqueName: \"kubernetes.io/projected/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-kube-api-access-dth5r\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.171221 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bbbc5689-2333-4227-984e-57e82e237746-machine-approver-tls\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.171357 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56536c3b-13f4-4ead-a2e4-2c30a87ef64c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6hzds\" (UID: \"56536c3b-13f4-4ead-a2e4-2c30a87ef64c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.171468 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2031118b-50a5-4d09-afab-37e601a1631e-metrics-tls\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.171569 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2031118b-50a5-4d09-afab-37e601a1631e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.171693 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-metrics-certs\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.171792 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-certificates\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.171063 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.171570 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415154e7-28be-49ac-954d-88342198e56e-config\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.171804 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ba656f8-77bb-4402-8242-6fe3b116a8cc-secret-volume\") pod \"collect-profiles-29414895-vsxsw\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172028 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrqt7\" (UniqueName: 
\"kubernetes.io/projected/09216400-991e-47e9-8494-ff24c8968a33-kube-api-access-mrqt7\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172049 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5b2s\" (UniqueName: \"kubernetes.io/projected/45df7d8a-597a-42b0-8116-37bf7d3e7627-kube-api-access-g5b2s\") pod \"image-pruner-29414880-vdbj2\" (UID: \"45df7d8a-597a-42b0-8116-37bf7d3e7627\") " pod="openshift-image-registry/image-pruner-29414880-vdbj2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172086 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a888f59-c21c-4786-8a70-cdabfba7a293-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5b4pd\" (UID: \"9a888f59-c21c-4786-8a70-cdabfba7a293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172106 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x99r7\" (UniqueName: \"kubernetes.io/projected/22760a3a-a198-4a84-8de8-20a745b3cb30-kube-api-access-x99r7\") pod \"service-ca-9c57cc56f-9c5t7\" (UID: \"22760a3a-a198-4a84-8de8-20a745b3cb30\") " pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172124 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2tzs\" (UniqueName: \"kubernetes.io/projected/0206012f-d861-4f45-9dfd-3923117fea31-kube-api-access-c2tzs\") pod \"ingress-canary-9pz94\" (UID: \"0206012f-d861-4f45-9dfd-3923117fea31\") " pod="openshift-ingress-canary/ingress-canary-9pz94" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172164 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86a28668-36ac-4656-b1bb-1fd68c51e6de-serving-cert\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172186 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/86a28668-36ac-4656-b1bb-1fd68c51e6de-etcd-ca\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172202 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq8bg\" (UniqueName: \"kubernetes.io/projected/f721b883-a5eb-4ecb-9565-e03cdb24c368-kube-api-access-wq8bg\") pod \"service-ca-operator-777779d784-drwnh\" (UID: \"f721b883-a5eb-4ecb-9565-e03cdb24c368\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172249 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-audit-policies\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: 
\"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172266 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-policies\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172284 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172321 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv74b\" (UniqueName: \"kubernetes.io/projected/398957bf-f56a-4d8c-8e79-73bc19356c88-kube-api-access-hv74b\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172342 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/adb9d332-b13b-456d-9d04-32124d387a36-etcd-client\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172364 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntp6z\" (UniqueName: \"kubernetes.io/projected/1a9ce926-4d8b-4608-9c75-9ddbc87a2464-kube-api-access-ntp6z\") pod \"control-plane-machine-set-operator-78cbb6b69f-2zx9b\" (UID: \"1a9ce926-4d8b-4608-9c75-9ddbc87a2464\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172387 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwmc6\" (UniqueName: \"kubernetes.io/projected/9ba656f8-77bb-4402-8242-6fe3b116a8cc-kube-api-access-kwmc6\") pod \"collect-profiles-29414895-vsxsw\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172403 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86a28668-36ac-4656-b1bb-1fd68c51e6de-etcd-client\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172421 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-bound-sa-token\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172439 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/398957bf-f56a-4d8c-8e79-73bc19356c88-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172456 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56536c3b-13f4-4ead-a2e4-2c30a87ef64c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6hzds\" (UID: \"56536c3b-13f4-4ead-a2e4-2c30a87ef64c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172475 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172495 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172512 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22760a3a-a198-4a84-8de8-20a745b3cb30-signing-key\") pod \"service-ca-9c57cc56f-9c5t7\" (UID: \"22760a3a-a198-4a84-8de8-20a745b3cb30\") " pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172530 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/86a28668-36ac-4656-b1bb-1fd68c51e6de-etcd-service-ca\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172546 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8vtnn\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172568 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ba656f8-77bb-4402-8242-6fe3b116a8cc-config-volume\") pod \"collect-profiles-29414895-vsxsw\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172585 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b9d750eb-071c-4580-b626-26b375e56870-srv-cert\") pod \"olm-operator-6b444d44fb-s8pqf\" (UID: \"b9d750eb-071c-4580-b626-26b375e56870\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172611 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172634 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adb9d332-b13b-456d-9d04-32124d387a36-serving-cert\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172657 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24rcs\" (UniqueName: \"kubernetes.io/projected/f27f029e-62d3-497f-8f69-a95229ebe945-kube-api-access-24rcs\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172676 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172694 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-config\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172718 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp58s\" (UniqueName: \"kubernetes.io/projected/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-kube-api-access-rp58s\") pod \"dns-default-f9knl\" (UID: \"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69\") " pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172752 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 
00:25:21.172778 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172799 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-audit-policies\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172801 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-stats-auth\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172856 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f27f029e-62d3-497f-8f69-a95229ebe945-trusted-ca\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172880 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6840df22-cb55-402a-9138-567bcdae100c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vm2fd\" (UID: \"6840df22-cb55-402a-9138-567bcdae100c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172902 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-image-import-ca\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172927 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-mountpoint-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172956 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntbx6\" (UniqueName: \"kubernetes.io/projected/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-kube-api-access-ntbx6\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.172994 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-serving-cert\") 
pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173020 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/392f2eff-cfc4-431e-83ad-a8c9328aa8a8-srv-cert\") pod \"catalog-operator-68c6474976-fpl7m\" (UID: \"392f2eff-cfc4-431e-83ad-a8c9328aa8a8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173047 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-config\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173072 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-node-bootstrap-token\") pod \"machine-config-server-9r6hq\" (UID: \"dc43ed3f-937d-4de9-9e4a-301788d5d19d\") " pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173134 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-config-volume\") pod \"dns-default-f9knl\" (UID: \"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69\") " pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173160 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbswj\" (UniqueName: \"kubernetes.io/projected/84b8f271-fcc3-4014-8a36-3e7019bef7c5-kube-api-access-kbswj\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173184 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6171c662-5317-43a1-bc72-e0d9fbe54466-installation-pull-secrets\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173211 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/398957bf-f56a-4d8c-8e79-73bc19356c88-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173257 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-serving-cert\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173284 
4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-metrics-tls\") pod \"dns-default-f9knl\" (UID: \"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69\") " pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173328 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-proxy-tls\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173359 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/adb9d332-b13b-456d-9d04-32124d387a36-audit-dir\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173383 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff68e70c-0561-4324-8cd4-1c8897cff45b-proxy-tls\") pod \"machine-config-controller-84d6567774-pc8pz\" (UID: \"ff68e70c-0561-4324-8cd4-1c8897cff45b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173406 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a28668-36ac-4656-b1bb-1fd68c51e6de-config\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173428 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x4bz\" (UniqueName: \"kubernetes.io/projected/11f20e7f-9d7e-4574-928a-42697c2fdb81-kube-api-access-9x4bz\") pod \"migrator-59844c95c7-6d754\" (UID: \"11f20e7f-9d7e-4574-928a-42697c2fdb81\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173465 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb7e479b-65fb-45dc-bf4b-cda530317c77-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wwwmq\" (UID: \"cb7e479b-65fb-45dc-bf4b-cda530317c77\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173506 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6171c662-5317-43a1-bc72-e0d9fbe54466-ca-trust-extracted\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173540 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqtzd\" (UniqueName: 
\"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-kube-api-access-wqtzd\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173576 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-client-ca\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173614 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d64b1f1-f632-4956-a9d8-83703ef96ca1-serving-cert\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173659 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv5k9\" (UniqueName: \"kubernetes.io/projected/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-kube-api-access-xv5k9\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173696 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8vtnn\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173738 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c369f5e-c51c-46f0-a184-7e9c627451f4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6fgg\" (UID: \"3c369f5e-c51c-46f0-a184-7e9c627451f4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173805 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173849 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173902 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6840df22-cb55-402a-9138-567bcdae100c-config\") pod \"kube-controller-manager-operator-78b949d7b-vm2fd\" (UID: \"6840df22-cb55-402a-9138-567bcdae100c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173936 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.173970 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2031118b-50a5-4d09-afab-37e601a1631e-trusted-ca\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.174004 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bbbc5689-2333-4227-984e-57e82e237746-auth-proxy-config\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.174046 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.174094 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/415154e7-28be-49ac-954d-88342198e56e-service-ca-bundle\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.174331 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/adb9d332-b13b-456d-9d04-32124d387a36-audit-dir\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.174640 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:21.674624075 +0000 UTC m=+140.890285115 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.174731 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6171c662-5317-43a1-bc72-e0d9fbe54466-ca-trust-extracted\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.174899 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-image-import-ca\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.175199 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a888f59-c21c-4786-8a70-cdabfba7a293-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5b4pd\" (UID: \"9a888f59-c21c-4786-8a70-cdabfba7a293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.175462 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/adb9d332-b13b-456d-9d04-32124d387a36-etcd-client\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.175914 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-config\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.175969 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-etcd-client\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.176054 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f27f029e-62d3-497f-8f69-a95229ebe945-trusted-ca\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.176148 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40f30299-1808-43a6-83db-44e27fa0b18e-metrics-tls\") pod \"dns-operator-744455d44c-ppbrk\" (UID: \"40f30299-1808-43a6-83db-44e27fa0b18e\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.176172 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-console-config\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.176080 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56536c3b-13f4-4ead-a2e4-2c30a87ef64c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6hzds\" (UID: \"56536c3b-13f4-4ead-a2e4-2c30a87ef64c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.176305 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.176518 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-tls\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.176989 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-console-config\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.177242 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/415154e7-28be-49ac-954d-88342198e56e-service-ca-bundle\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.178033 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adb9d332-b13b-456d-9d04-32124d387a36-serving-cert\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.178101 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/398957bf-f56a-4d8c-8e79-73bc19356c88-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.178718 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6840df22-cb55-402a-9138-567bcdae100c-config\") pod \"kube-controller-manager-operator-78b949d7b-vm2fd\" (UID: \"6840df22-cb55-402a-9138-567bcdae100c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.178396 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.179548 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.180057 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adb9d332-b13b-456d-9d04-32124d387a36-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.180124 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6840df22-cb55-402a-9138-567bcdae100c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vm2fd\" (UID: \"6840df22-cb55-402a-9138-567bcdae100c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.180178 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6171c662-5317-43a1-bc72-e0d9fbe54466-installation-pull-secrets\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.180237 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40f30299-1808-43a6-83db-44e27fa0b18e-metrics-tls\") pod \"dns-operator-744455d44c-ppbrk\" (UID: \"40f30299-1808-43a6-83db-44e27fa0b18e\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.180915 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f27f029e-62d3-497f-8f69-a95229ebe945-serving-cert\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.181347 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-serving-cert\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.181727 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.202707 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.221954 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.242170 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.270038 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.276717 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.276860 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:21.776836554 +0000 UTC m=+140.992497504 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.277259 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c369f5e-c51c-46f0-a184-7e9c627451f4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6fgg\" (UID: \"3c369f5e-c51c-46f0-a184-7e9c627451f4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.277380 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/03b74b39-100a-4fe4-8bae-0f2088728e24-tmpfs\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.277847 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-plugins-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.277979 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ctrp6\" (UniqueName: \"kubernetes.io/projected/bbbc5689-2333-4227-984e-57e82e237746-kube-api-access-ctrp6\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278092 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff68e70c-0561-4324-8cd4-1c8897cff45b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pc8pz\" (UID: \"ff68e70c-0561-4324-8cd4-1c8897cff45b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278170 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/392f2eff-cfc4-431e-83ad-a8c9328aa8a8-profile-collector-cert\") pod \"catalog-operator-68c6474976-fpl7m\" (UID: \"392f2eff-cfc4-431e-83ad-a8c9328aa8a8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278286 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/45df7d8a-597a-42b0-8116-37bf7d3e7627-serviceca\") pod \"image-pruner-29414880-vdbj2\" (UID: \"45df7d8a-597a-42b0-8116-37bf7d3e7627\") " pod="openshift-image-registry/image-pruner-29414880-vdbj2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278377 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6zm7\" (UniqueName: \"kubernetes.io/projected/2031118b-50a5-4d09-afab-37e601a1631e-kube-api-access-w6zm7\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278475 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c369f5e-c51c-46f0-a184-7e9c627451f4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6fgg\" (UID: \"3c369f5e-c51c-46f0-a184-7e9c627451f4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278575 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwtbr\" (UniqueName: \"kubernetes.io/projected/392f2eff-cfc4-431e-83ad-a8c9328aa8a8-kube-api-access-nwtbr\") pod \"catalog-operator-68c6474976-fpl7m\" (UID: \"392f2eff-cfc4-431e-83ad-a8c9328aa8a8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278090 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/03b74b39-100a-4fe4-8bae-0f2088728e24-tmpfs\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278748 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2183c13d-b67c-403a-8723-8c62e3ad57f3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-66mwb\" (UID: \"2183c13d-b67c-403a-8723-8c62e3ad57f3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278841 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czwvd\" (UniqueName: \"kubernetes.io/projected/cb7e479b-65fb-45dc-bf4b-cda530317c77-kube-api-access-czwvd\") pod \"multus-admission-controller-857f4d67dd-wwwmq\" (UID: \"cb7e479b-65fb-45dc-bf4b-cda530317c77\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278935 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb9fg\" (UniqueName: \"kubernetes.io/projected/03b74b39-100a-4fe4-8bae-0f2088728e24-kube-api-access-nb9fg\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.279018 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a9ce926-4d8b-4608-9c75-9ddbc87a2464-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2zx9b\" (UID: \"1a9ce926-4d8b-4608-9c75-9ddbc87a2464\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.279091 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-default-certificate\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.279020 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/45df7d8a-597a-42b0-8116-37bf7d3e7627-serviceca\") pod \"image-pruner-29414880-vdbj2\" (UID: \"45df7d8a-597a-42b0-8116-37bf7d3e7627\") " pod="openshift-image-registry/image-pruner-29414880-vdbj2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.279265 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-service-ca-bundle\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.279405 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03b74b39-100a-4fe4-8bae-0f2088728e24-webhook-cert\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278180 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-plugins-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: 
\"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.278961 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff68e70c-0561-4324-8cd4-1c8897cff45b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pc8pz\" (UID: \"ff68e70c-0561-4324-8cd4-1c8897cff45b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.279618 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-socket-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.279696 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-socket-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.279830 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc1ec533-684e-4f58-8861-adc357bf448e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dpv8g\" (UID: \"fc1ec533-684e-4f58-8861-adc357bf448e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.279952 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-registration-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.280056 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03b74b39-100a-4fe4-8bae-0f2088728e24-apiservice-cert\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.280157 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-images\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.280262 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f721b883-a5eb-4ecb-9565-e03cdb24c368-config\") pod \"service-ca-operator-777779d784-drwnh\" (UID: \"f721b883-a5eb-4ecb-9565-e03cdb24c368\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.280394 4759 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-certs\") pod \"machine-config-server-9r6hq\" (UID: \"dc43ed3f-937d-4de9-9e4a-301788d5d19d\") " pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.280511 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4zf6\" (UniqueName: \"kubernetes.io/projected/b9d750eb-071c-4580-b626-26b375e56870-kube-api-access-v4zf6\") pod \"olm-operator-6b444d44fb-s8pqf\" (UID: \"b9d750eb-071c-4580-b626-26b375e56870\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.280622 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0206012f-d861-4f45-9dfd-3923117fea31-cert\") pod \"ingress-canary-9pz94\" (UID: \"0206012f-d861-4f45-9dfd-3923117fea31\") " pod="openshift-ingress-canary/ingress-canary-9pz94" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.280740 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgk8q\" (UniqueName: \"kubernetes.io/projected/86a28668-36ac-4656-b1bb-1fd68c51e6de-kube-api-access-tgk8q\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.281405 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f721b883-a5eb-4ecb-9565-e03cdb24c368-serving-cert\") pod \"service-ca-operator-777779d784-drwnh\" (UID: \"f721b883-a5eb-4ecb-9565-e03cdb24c368\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.281445 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5bgw\" (UniqueName: \"kubernetes.io/projected/0d64b1f1-f632-4956-a9d8-83703ef96ca1-kube-api-access-l5bgw\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.280463 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-service-ca-bundle\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.281466 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22760a3a-a198-4a84-8de8-20a745b3cb30-signing-cabundle\") pod \"service-ca-9c57cc56f-9c5t7\" (UID: \"22760a3a-a198-4a84-8de8-20a745b3cb30\") " pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.280578 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-registration-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc 
kubenswrapper[4759]: I1205 00:25:21.281648 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dth5r\" (UniqueName: \"kubernetes.io/projected/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-kube-api-access-dth5r\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.281708 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bbbc5689-2333-4227-984e-57e82e237746-machine-approver-tls\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.281731 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-metrics-certs\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.281748 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2031118b-50a5-4d09-afab-37e601a1631e-metrics-tls\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.281787 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2031118b-50a5-4d09-afab-37e601a1631e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.281809 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5b2s\" (UniqueName: \"kubernetes.io/projected/45df7d8a-597a-42b0-8116-37bf7d3e7627-kube-api-access-g5b2s\") pod \"image-pruner-29414880-vdbj2\" (UID: \"45df7d8a-597a-42b0-8116-37bf7d3e7627\") " pod="openshift-image-registry/image-pruner-29414880-vdbj2" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.281884 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ba656f8-77bb-4402-8242-6fe3b116a8cc-secret-volume\") pod \"collect-profiles-29414895-vsxsw\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282021 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2183c13d-b67c-403a-8723-8c62e3ad57f3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-66mwb\" (UID: \"2183c13d-b67c-403a-8723-8c62e3ad57f3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.281907 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mrqt7\" (UniqueName: \"kubernetes.io/projected/09216400-991e-47e9-8494-ff24c8968a33-kube-api-access-mrqt7\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282187 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x99r7\" (UniqueName: \"kubernetes.io/projected/22760a3a-a198-4a84-8de8-20a745b3cb30-kube-api-access-x99r7\") pod \"service-ca-9c57cc56f-9c5t7\" (UID: \"22760a3a-a198-4a84-8de8-20a745b3cb30\") " pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282212 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tzs\" (UniqueName: \"kubernetes.io/projected/0206012f-d861-4f45-9dfd-3923117fea31-kube-api-access-c2tzs\") pod \"ingress-canary-9pz94\" (UID: \"0206012f-d861-4f45-9dfd-3923117fea31\") " pod="openshift-ingress-canary/ingress-canary-9pz94" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282266 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86a28668-36ac-4656-b1bb-1fd68c51e6de-serving-cert\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282295 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/86a28668-36ac-4656-b1bb-1fd68c51e6de-etcd-ca\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282340 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq8bg\" (UniqueName: \"kubernetes.io/projected/f721b883-a5eb-4ecb-9565-e03cdb24c368-kube-api-access-wq8bg\") pod \"service-ca-operator-777779d784-drwnh\" (UID: \"f721b883-a5eb-4ecb-9565-e03cdb24c368\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282424 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwmc6\" (UniqueName: \"kubernetes.io/projected/9ba656f8-77bb-4402-8242-6fe3b116a8cc-kube-api-access-kwmc6\") pod \"collect-profiles-29414895-vsxsw\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282458 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntp6z\" (UniqueName: \"kubernetes.io/projected/1a9ce926-4d8b-4608-9c75-9ddbc87a2464-kube-api-access-ntp6z\") pod \"control-plane-machine-set-operator-78cbb6b69f-2zx9b\" (UID: \"1a9ce926-4d8b-4608-9c75-9ddbc87a2464\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282497 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22760a3a-a198-4a84-8de8-20a745b3cb30-signing-key\") pod \"service-ca-9c57cc56f-9c5t7\" (UID: \"22760a3a-a198-4a84-8de8-20a745b3cb30\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282515 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86a28668-36ac-4656-b1bb-1fd68c51e6de-etcd-client\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282536 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/86a28668-36ac-4656-b1bb-1fd68c51e6de-etcd-service-ca\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282553 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8vtnn\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282568 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ba656f8-77bb-4402-8242-6fe3b116a8cc-config-volume\") pod \"collect-profiles-29414895-vsxsw\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282584 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b9d750eb-071c-4580-b626-26b375e56870-srv-cert\") pod \"olm-operator-6b444d44fb-s8pqf\" (UID: \"b9d750eb-071c-4580-b626-26b375e56870\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282608 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-config\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282625 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp58s\" (UniqueName: \"kubernetes.io/projected/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-kube-api-access-rp58s\") pod \"dns-default-f9knl\" (UID: \"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69\") " pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282649 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-stats-auth\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282667 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282684 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-mountpoint-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282712 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/392f2eff-cfc4-431e-83ad-a8c9328aa8a8-srv-cert\") pod \"catalog-operator-68c6474976-fpl7m\" (UID: \"392f2eff-cfc4-431e-83ad-a8c9328aa8a8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282726 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-config-volume\") pod \"dns-default-f9knl\" (UID: \"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69\") " pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282744 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-node-bootstrap-token\") pod \"machine-config-server-9r6hq\" (UID: \"dc43ed3f-937d-4de9-9e4a-301788d5d19d\") " pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282778 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-metrics-tls\") pod \"dns-default-f9knl\" (UID: \"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69\") " pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282793 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-proxy-tls\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282818 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a28668-36ac-4656-b1bb-1fd68c51e6de-config\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282834 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x4bz\" (UniqueName: \"kubernetes.io/projected/11f20e7f-9d7e-4574-928a-42697c2fdb81-kube-api-access-9x4bz\") pod \"migrator-59844c95c7-6d754\" (UID: \"11f20e7f-9d7e-4574-928a-42697c2fdb81\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282850 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff68e70c-0561-4324-8cd4-1c8897cff45b-proxy-tls\") pod \"machine-config-controller-84d6567774-pc8pz\" (UID: \"ff68e70c-0561-4324-8cd4-1c8897cff45b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282866 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-client-ca\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282881 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d64b1f1-f632-4956-a9d8-83703ef96ca1-serving-cert\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282897 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5k9\" (UniqueName: \"kubernetes.io/projected/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-kube-api-access-xv5k9\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282912 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8vtnn\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282929 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb7e479b-65fb-45dc-bf4b-cda530317c77-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wwwmq\" (UID: \"cb7e479b-65fb-45dc-bf4b-cda530317c77\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282959 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c369f5e-c51c-46f0-a184-7e9c627451f4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6fgg\" (UID: \"3c369f5e-c51c-46f0-a184-7e9c627451f4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.282982 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.283703 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-mountpoint-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.283784 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2031118b-50a5-4d09-afab-37e601a1631e-trusted-ca\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.283837 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bbbc5689-2333-4227-984e-57e82e237746-auth-proxy-config\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.283878 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78n8t\" (UniqueName: \"kubernetes.io/projected/ff68e70c-0561-4324-8cd4-1c8897cff45b-kube-api-access-78n8t\") pod \"machine-config-controller-84d6567774-pc8pz\" (UID: \"ff68e70c-0561-4324-8cd4-1c8897cff45b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.283901 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b9d750eb-071c-4580-b626-26b375e56870-profile-collector-cert\") pod \"olm-operator-6b444d44fb-s8pqf\" (UID: \"b9d750eb-071c-4580-b626-26b375e56870\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.283946 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-csi-data-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.283976 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5p86\" (UniqueName: \"kubernetes.io/projected/2183c13d-b67c-403a-8723-8c62e3ad57f3-kube-api-access-t5p86\") pod \"package-server-manager-789f6589d5-66mwb\" (UID: \"2183c13d-b67c-403a-8723-8c62e3ad57f3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.284006 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1ec533-684e-4f58-8861-adc357bf448e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dpv8g\" (UID: \"fc1ec533-684e-4f58-8861-adc357bf448e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.284046 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h7jx\" (UniqueName: \"kubernetes.io/projected/fc1ec533-684e-4f58-8861-adc357bf448e-kube-api-access-9h7jx\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-dpv8g\" (UID: \"fc1ec533-684e-4f58-8861-adc357bf448e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.284077 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s6sm\" (UniqueName: \"kubernetes.io/projected/a6019120-bf7b-47df-9e54-c7761066eb48-kube-api-access-8s6sm\") pod \"marketplace-operator-79b997595-8vtnn\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.284108 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbbc5689-2333-4227-984e-57e82e237746-config\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.284159 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcmss\" (UniqueName: \"kubernetes.io/projected/dc43ed3f-937d-4de9-9e4a-301788d5d19d-kube-api-access-tcmss\") pod \"machine-config-server-9r6hq\" (UID: \"dc43ed3f-937d-4de9-9e4a-301788d5d19d\") " pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.284168 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03b74b39-100a-4fe4-8bae-0f2088728e24-webhook-cert\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.285026 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-default-certificate\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.285115 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03b74b39-100a-4fe4-8bae-0f2088728e24-apiservice-cert\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.287941 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-metrics-certs\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.288204 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:21 crc 
kubenswrapper[4759]: I1205 00:25:21.288230 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/09216400-991e-47e9-8494-ff24c8968a33-csi-data-dir\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.288510 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc1ec533-684e-4f58-8861-adc357bf448e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dpv8g\" (UID: \"fc1ec533-684e-4f58-8861-adc357bf448e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.288936 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1ec533-684e-4f58-8861-adc357bf448e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dpv8g\" (UID: \"fc1ec533-684e-4f58-8861-adc357bf448e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.288943 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:21.788921649 +0000 UTC m=+141.004582589 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.289179 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.289911 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8vtnn\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.290234 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a9ce926-4d8b-4608-9c75-9ddbc87a2464-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2zx9b\" (UID: \"1a9ce926-4d8b-4608-9c75-9ddbc87a2464\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.290954 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8vtnn\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.296854 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-stats-auth\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.303074 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.322015 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.327841 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ba656f8-77bb-4402-8242-6fe3b116a8cc-config-volume\") pod \"collect-profiles-29414895-vsxsw\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.342060 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.345561 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ba656f8-77bb-4402-8242-6fe3b116a8cc-secret-volume\") pod \"collect-profiles-29414895-vsxsw\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.351235 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/392f2eff-cfc4-431e-83ad-a8c9328aa8a8-profile-collector-cert\") pod \"catalog-operator-68c6474976-fpl7m\" (UID: \"392f2eff-cfc4-431e-83ad-a8c9328aa8a8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.351982 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b9d750eb-071c-4580-b626-26b375e56870-profile-collector-cert\") pod \"olm-operator-6b444d44fb-s8pqf\" (UID: \"b9d750eb-071c-4580-b626-26b375e56870\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.361678 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.369353 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff68e70c-0561-4324-8cd4-1c8897cff45b-proxy-tls\") pod \"machine-config-controller-84d6567774-pc8pz\" (UID: \"ff68e70c-0561-4324-8cd4-1c8897cff45b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.382448 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 00:25:21 crc 
kubenswrapper[4759]: I1205 00:25:21.384771 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.384887 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:21.884864386 +0000 UTC m=+141.100525336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.385184 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.385495 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:21.885486561 +0000 UTC m=+141.101147511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.402786 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.406564 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a28668-36ac-4656-b1bb-1fd68c51e6de-config\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.422353 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.442610 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.456730 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86a28668-36ac-4656-b1bb-1fd68c51e6de-serving-cert\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.462249 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.470844 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86a28668-36ac-4656-b1bb-1fd68c51e6de-etcd-client\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.482093 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.483588 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/86a28668-36ac-4656-b1bb-1fd68c51e6de-etcd-ca\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.488437 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.488586 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:21.988572012 +0000 UTC m=+141.204232972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.489209 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.489733 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:21.98971448 +0000 UTC m=+141.205375440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.502416 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.508963 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/86a28668-36ac-4656-b1bb-1fd68c51e6de-etcd-service-ca\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.522710 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.543091 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.562569 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.574052 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c369f5e-c51c-46f0-a184-7e9c627451f4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6fgg\" (UID: \"3c369f5e-c51c-46f0-a184-7e9c627451f4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.582235 
4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.590678 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.590850 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.090827463 +0000 UTC m=+141.306488423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.591401 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.591875 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.091855287 +0000 UTC m=+141.307516237 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.602103 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.608570 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c369f5e-c51c-46f0-a184-7e9c627451f4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6fgg\" (UID: \"3c369f5e-c51c-46f0-a184-7e9c627451f4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.623108 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.642257 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.655737 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2031118b-50a5-4d09-afab-37e601a1631e-metrics-tls\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.663226 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.682968 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.692141 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.692386 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.192365946 +0000 UTC m=+141.408026906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.692876 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.693226 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.193215947 +0000 UTC m=+141.408876907 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.701027 4759 request.go:700] Waited for 1.009566131s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/configmaps?fieldSelector=metadata.name%3Dtrusted-ca&limit=500&resourceVersion=0 Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.712793 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.718788 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2031118b-50a5-4d09-afab-37e601a1631e-trusted-ca\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.721774 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.742156 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.762853 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.782378 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.793894 4759 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.794982 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.294966705 +0000 UTC m=+141.510627655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.801816 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.810901 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b9d750eb-071c-4580-b626-26b375e56870-srv-cert\") pod \"olm-operator-6b444d44fb-s8pqf\" (UID: \"b9d750eb-071c-4580-b626-26b375e56870\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.821706 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.828060 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/392f2eff-cfc4-431e-83ad-a8c9328aa8a8-srv-cert\") pod \"catalog-operator-68c6474976-fpl7m\" (UID: \"392f2eff-cfc4-431e-83ad-a8c9328aa8a8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.850644 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.851422 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-images\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.862466 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.883549 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.891223 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-proxy-tls\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.896825 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.897751 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.397739928 +0000 UTC m=+141.613400868 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.902224 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.922933 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.943396 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.956269 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bbbc5689-2333-4227-984e-57e82e237746-machine-approver-tls\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.962940 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.982066 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.991511 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbbc5689-2333-4227-984e-57e82e237746-config\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.998243 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.998440 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.49840993 +0000 UTC m=+141.714070880 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:21 crc kubenswrapper[4759]: I1205 00:25:21.998901 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:21 crc kubenswrapper[4759]: E1205 00:25:21.999230 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.49922191 +0000 UTC m=+141.714882860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.003076 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.008756 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bbbc5689-2333-4227-984e-57e82e237746-auth-proxy-config\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.023533 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.042902 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.052233 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f721b883-a5eb-4ecb-9565-e03cdb24c368-config\") pod \"service-ca-operator-777779d784-drwnh\" (UID: \"f721b883-a5eb-4ecb-9565-e03cdb24c368\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.063115 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.082890 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.095208 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f721b883-a5eb-4ecb-9565-e03cdb24c368-serving-cert\") pod \"service-ca-operator-777779d784-drwnh\" (UID: \"f721b883-a5eb-4ecb-9565-e03cdb24c368\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.100349 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.100673 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.60063778 +0000 UTC m=+141.816298790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.101537 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.101968 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.601946912 +0000 UTC m=+141.817607932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.102145 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.123070 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.143595 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.151194 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d64b1f1-f632-4956-a9d8-83703ef96ca1-serving-cert\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.163252 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.171398 4759 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.171505 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session podName:84b8f271-fcc3-4014-8a36-3e7019bef7c5 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.671470155 +0000 UTC m=+141.887131145 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session") pod "oauth-openshift-558db77b4-lr9vd" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.172644 4759 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-idp-0-file-data: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.172678 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-idp-0-file-data podName:84b8f271-fcc3-4014-8a36-3e7019bef7c5 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.672669875 +0000 UTC m=+141.888330825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-idp-0-file-data" (UniqueName: "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-idp-0-file-data") pod "oauth-openshift-558db77b4-lr9vd" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.172772 4759 configmap.go:193] Couldn't get configMap openshift-authentication/audit: failed to sync configmap cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.172864 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-policies podName:84b8f271-fcc3-4014-8a36-3e7019bef7c5 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.672840149 +0000 UTC m=+141.888501149 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-policies") pod "oauth-openshift-558db77b4-lr9vd" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5") : failed to sync configmap cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.182563 4759 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-error: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.182675 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error podName:84b8f271-fcc3-4014-8a36-3e7019bef7c5 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.682653448 +0000 UTC m=+141.898314398 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-error" (UniqueName: "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error") pod "oauth-openshift-558db77b4-lr9vd" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.182707 4759 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.182752 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-serving-cert podName:0ae425cf-9dc3-471e-a2b8-506eedb29c8d nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.68274132 +0000 UTC m=+141.898402270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-serving-cert") pod "controller-manager-879f6c89f-7hqjc" (UID: "0ae425cf-9dc3-471e-a2b8-506eedb29c8d") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.183558 4759 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.183655 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-cliconfig podName:84b8f271-fcc3-4014-8a36-3e7019bef7c5 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.683628002 +0000 UTC m=+141.899288992 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-cliconfig") pod "oauth-openshift-558db77b4-lr9vd" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5") : failed to sync configmap cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.183706 4759 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-ocp-branding-template: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.183770 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-ocp-branding-template podName:84b8f271-fcc3-4014-8a36-3e7019bef7c5 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.683749914 +0000 UTC m=+141.899410904 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-ocp-branding-template" (UniqueName: "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-ocp-branding-template") pod "oauth-openshift-558db77b4-lr9vd" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.183856 4759 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-provider-selection: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.183928 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-provider-selection podName:84b8f271-fcc3-4014-8a36-3e7019bef7c5 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.683908568 +0000 UTC m=+141.899569558 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-provider-selection" (UniqueName: "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-provider-selection") pod "oauth-openshift-558db77b4-lr9vd" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.184861 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.188424 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-config\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.201544 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.202723 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.202806 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.702789689 +0000 UTC m=+141.918450639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.203047 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.203325 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.703297981 +0000 UTC m=+141.918958931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.221981 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.227469 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-client-ca\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.242268 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.262939 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.271052 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22760a3a-a198-4a84-8de8-20a745b3cb30-signing-key\") pod \"service-ca-9c57cc56f-9c5t7\" (UID: \"22760a3a-a198-4a84-8de8-20a745b3cb30\") " pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.281492 4759 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.281495 4759 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 
00:25:22.281600 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-certs podName:dc43ed3f-937d-4de9-9e4a-301788d5d19d nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.781578727 +0000 UTC m=+141.997239747 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-certs") pod "machine-config-server-9r6hq" (UID: "dc43ed3f-937d-4de9-9e4a-301788d5d19d") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.281675 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0206012f-d861-4f45-9dfd-3923117fea31-cert podName:0206012f-d861-4f45-9dfd-3923117fea31 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.781641069 +0000 UTC m=+141.997302099 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0206012f-d861-4f45-9dfd-3923117fea31-cert") pod "ingress-canary-9pz94" (UID: "0206012f-d861-4f45-9dfd-3923117fea31") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.281760 4759 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.281860 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22760a3a-a198-4a84-8de8-20a745b3cb30-signing-cabundle podName:22760a3a-a198-4a84-8de8-20a745b3cb30 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.781835153 +0000 UTC m=+141.997496143 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/22760a3a-a198-4a84-8de8-20a745b3cb30-signing-cabundle") pod "service-ca-9c57cc56f-9c5t7" (UID: "22760a3a-a198-4a84-8de8-20a745b3cb30") : failed to sync configmap cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.283081 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.285248 4759 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.285285 4759 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.285268 4759 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.285347 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-node-bootstrap-token podName:dc43ed3f-937d-4de9-9e4a-301788d5d19d nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.785335899 +0000 UTC m=+142.000996849 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-node-bootstrap-token") pod "machine-config-server-9r6hq" (UID: "dc43ed3f-937d-4de9-9e4a-301788d5d19d") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.285368 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-metrics-tls podName:70145bc3-1bf0-48e3-ab5d-84b0e04dfd69 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.78536023 +0000 UTC m=+142.001021180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-metrics-tls") pod "dns-default-f9knl" (UID: "70145bc3-1bf0-48e3-ab5d-84b0e04dfd69") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.285388 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-config-volume podName:70145bc3-1bf0-48e3-ab5d-84b0e04dfd69 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.78538085 +0000 UTC m=+142.001041800 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-config-volume") pod "dns-default-f9knl" (UID: "70145bc3-1bf0-48e3-ab5d-84b0e04dfd69") : failed to sync configmap cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.287510 4759 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.287588 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb7e479b-65fb-45dc-bf4b-cda530317c77-webhook-certs podName:cb7e479b-65fb-45dc-bf4b-cda530317c77 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.787572573 +0000 UTC m=+142.003233603 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cb7e479b-65fb-45dc-bf4b-cda530317c77-webhook-certs") pod "multus-admission-controller-857f4d67dd-wwwmq" (UID: "cb7e479b-65fb-45dc-bf4b-cda530317c77") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.301904 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.304408 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.304566 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.804548276 +0000 UTC m=+142.020209236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.304753 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.305104 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.80508994 +0000 UTC m=+142.020750890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.322390 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.342357 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.363808 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.382522 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.402497 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.405741 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.405916 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.905894746 +0000 UTC m=+142.121555706 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.406540 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.406897 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:22.9068858 +0000 UTC m=+142.122546750 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.422499 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.443158 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.463165 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.482266 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.503088 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.507725 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.508063 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:23.008036724 +0000 UTC m=+142.223697674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.508490 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.508940 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:23.008919854 +0000 UTC m=+142.224580824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.522239 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.542647 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.563133 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.582100 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.602564 4759 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.609178 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.609366 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:23.109347151 +0000 UTC m=+142.325008111 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.609562 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.609990 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:23.109979477 +0000 UTC m=+142.325640427 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.622976 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.686708 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpxmt\" (UniqueName: \"kubernetes.io/projected/08f680e4-29bc-4ffc-962b-1a3151e5e41f-kube-api-access-dpxmt\") pod \"cluster-samples-operator-665b6dd947-kcxjh\" (UID: \"08f680e4-29bc-4ffc-962b-1a3151e5e41f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.700780 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fflp6\" (UniqueName: \"kubernetes.io/projected/02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4-kube-api-access-fflp6\") pod \"openshift-config-operator-7777fb866f-qlgnw\" (UID: \"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.711215 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.711369 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.711461 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-policies\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.711478 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.711527 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.711550 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.711573 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.711633 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-serving-cert\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.711686 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.711907 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:23.211880078 +0000 UTC m=+142.427541068 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.720490 4759 request.go:700] Waited for 1.863622243s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.723072 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc8hb\" (UniqueName: \"kubernetes.io/projected/d5488f06-06a1-48b6-9103-abff66383776-kube-api-access-xc8hb\") pod \"openshift-controller-manager-operator-756b6f6bc6-rzm5t\" (UID: \"d5488f06-06a1-48b6-9103-abff66383776\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.741274 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47zt9\" (UniqueName: \"kubernetes.io/projected/f2b40743-a414-4dd8-9613-0bc14b937e3d-kube-api-access-47zt9\") pod \"machine-api-operator-5694c8668f-gng7x\" (UID: \"f2b40743-a414-4dd8-9613-0bc14b937e3d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.757419 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j8bl\" (UniqueName: \"kubernetes.io/projected/415154e7-28be-49ac-954d-88342198e56e-kube-api-access-4j8bl\") pod \"authentication-operator-69f744f599-dtbm8\" (UID: \"415154e7-28be-49ac-954d-88342198e56e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.777638 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlgjg\" (UniqueName: \"kubernetes.io/projected/adb9d332-b13b-456d-9d04-32124d387a36-kube-api-access-zlgjg\") pod \"apiserver-76f77b778f-ntfxg\" (UID: \"adb9d332-b13b-456d-9d04-32124d387a36\") " pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.780051 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.802742 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px8g9\" (UniqueName: \"kubernetes.io/projected/9a888f59-c21c-4786-8a70-cdabfba7a293-kube-api-access-px8g9\") pod \"openshift-apiserver-operator-796bbdcf4f-5b4pd\" (UID: \"9a888f59-c21c-4786-8a70-cdabfba7a293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.803063 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.809326 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.813107 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-certs\") pod \"machine-config-server-9r6hq\" (UID: \"dc43ed3f-937d-4de9-9e4a-301788d5d19d\") " pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.813207 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0206012f-d861-4f45-9dfd-3923117fea31-cert\") pod \"ingress-canary-9pz94\" (UID: \"0206012f-d861-4f45-9dfd-3923117fea31\") " pod="openshift-ingress-canary/ingress-canary-9pz94" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.813262 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22760a3a-a198-4a84-8de8-20a745b3cb30-signing-cabundle\") pod \"service-ca-9c57cc56f-9c5t7\" (UID: \"22760a3a-a198-4a84-8de8-20a745b3cb30\") " pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.813555 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-node-bootstrap-token\") pod \"machine-config-server-9r6hq\" (UID: \"dc43ed3f-937d-4de9-9e4a-301788d5d19d\") " pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.813589 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-config-volume\") pod \"dns-default-f9knl\" (UID: \"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69\") " pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.813649 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-metrics-tls\") pod \"dns-default-f9knl\" (UID: \"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69\") " pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.813700 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb7e479b-65fb-45dc-bf4b-cda530317c77-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wwwmq\" (UID: \"cb7e479b-65fb-45dc-bf4b-cda530317c77\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.813731 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.814204 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 00:25:23.31419055 +0000 UTC m=+142.529851500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.814598 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22760a3a-a198-4a84-8de8-20a745b3cb30-signing-cabundle\") pod \"service-ca-9c57cc56f-9c5t7\" (UID: \"22760a3a-a198-4a84-8de8-20a745b3cb30\") " pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.814838 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-config-volume\") pod \"dns-default-f9knl\" (UID: \"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69\") " pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.818027 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-metrics-tls\") pod \"dns-default-f9knl\" (UID: \"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69\") " pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.818319 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-certs\") pod \"machine-config-server-9r6hq\" (UID: \"dc43ed3f-937d-4de9-9e4a-301788d5d19d\") " pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.818377 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc43ed3f-937d-4de9-9e4a-301788d5d19d-node-bootstrap-token\") pod \"machine-config-server-9r6hq\" (UID: \"dc43ed3f-937d-4de9-9e4a-301788d5d19d\") " pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.818721 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb7e479b-65fb-45dc-bf4b-cda530317c77-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wwwmq\" (UID: \"cb7e479b-65fb-45dc-bf4b-cda530317c77\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.821117 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0206012f-d861-4f45-9dfd-3923117fea31-cert\") pod \"ingress-canary-9pz94\" (UID: \"0206012f-d861-4f45-9dfd-3923117fea31\") " pod="openshift-ingress-canary/ingress-canary-9pz94" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.821611 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6840df22-cb55-402a-9138-567bcdae100c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vm2fd\" (UID: 
\"6840df22-cb55-402a-9138-567bcdae100c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.858660 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfhpm\" (UniqueName: \"kubernetes.io/projected/f99bc61a-b820-4ebd-8ed0-d18cba6c017a-kube-api-access-dfhpm\") pod \"downloads-7954f5f757-wf4wz\" (UID: \"f99bc61a-b820-4ebd-8ed0-d18cba6c017a\") " pod="openshift-console/downloads-7954f5f757-wf4wz" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.874048 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.900778 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.901844 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wf4wz" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.905652 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krg75\" (UniqueName: \"kubernetes.io/projected/40f30299-1808-43a6-83db-44e27fa0b18e-kube-api-access-krg75\") pod \"dns-operator-744455d44c-ppbrk\" (UID: \"40f30299-1808-43a6-83db-44e27fa0b18e\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.914563 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.914716 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:23.414694238 +0000 UTC m=+142.630355188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.914916 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:22 crc kubenswrapper[4759]: E1205 00:25:22.915606 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 00:25:23.415594731 +0000 UTC m=+142.631255681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.918285 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv74b\" (UniqueName: \"kubernetes.io/projected/398957bf-f56a-4d8c-8e79-73bc19356c88-kube-api-access-hv74b\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.924666 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.932115 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b55v9\" (UniqueName: \"kubernetes.io/projected/576c976f-56ce-4409-8654-e9a6264a71d1-kube-api-access-b55v9\") pod \"console-f9d7485db-g8mxq\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") " pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.942367 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-bound-sa-token\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.963979 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56536c3b-13f4-4ead-a2e4-2c30a87ef64c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6hzds\" (UID: \"56536c3b-13f4-4ead-a2e4-2c30a87ef64c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.970419 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw" Dec 05 00:25:22 crc kubenswrapper[4759]: I1205 00:25:22.979640 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/398957bf-f56a-4d8c-8e79-73bc19356c88-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-slm7z\" (UID: \"398957bf-f56a-4d8c-8e79-73bc19356c88\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.014153 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.016328 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.016458 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.017146 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:23.517127473 +0000 UTC m=+142.732788423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.048035 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqtzd\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-kube-api-access-wqtzd\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.052212 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24rcs\" (UniqueName: \"kubernetes.io/projected/f27f029e-62d3-497f-8f69-a95229ebe945-kube-api-access-24rcs\") pod \"console-operator-58897d9998-vtldn\" (UID: \"f27f029e-62d3-497f-8f69-a95229ebe945\") " pod="openshift-console-operator/console-operator-58897d9998-vtldn" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.053422 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbswj\" (UniqueName: \"kubernetes.io/projected/84b8f271-fcc3-4014-8a36-3e7019bef7c5-kube-api-access-kbswj\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.056880 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntbx6\" (UniqueName: \"kubernetes.io/projected/ca2cc56f-b0a3-4e49-a9dd-c8918810f423-kube-api-access-ntbx6\") pod \"apiserver-7bbb656c7d-nljw2\" (UID: \"ca2cc56f-b0a3-4e49-a9dd-c8918810f423\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.062710 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.084220 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c369f5e-c51c-46f0-a184-7e9c627451f4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6fgg\" (UID: \"3c369f5e-c51c-46f0-a184-7e9c627451f4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.117137 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctrp6\" (UniqueName: \"kubernetes.io/projected/bbbc5689-2333-4227-984e-57e82e237746-kube-api-access-ctrp6\") pod \"machine-approver-56656f9798-mqdv9\" (UID: \"bbbc5689-2333-4227-984e-57e82e237746\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.119243 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.119630 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:23.619619531 +0000 UTC m=+142.835280481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.123707 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6zm7\" (UniqueName: \"kubernetes.io/projected/2031118b-50a5-4d09-afab-37e601a1631e-kube-api-access-w6zm7\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.127181 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vtldn" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.133629 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.146744 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.146979 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwtbr\" (UniqueName: \"kubernetes.io/projected/392f2eff-cfc4-431e-83ad-a8c9328aa8a8-kube-api-access-nwtbr\") pod \"catalog-operator-68c6474976-fpl7m\" (UID: \"392f2eff-cfc4-431e-83ad-a8c9328aa8a8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.165882 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.176587 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb9fg\" (UniqueName: \"kubernetes.io/projected/03b74b39-100a-4fe4-8bae-0f2088728e24-kube-api-access-nb9fg\") pod \"packageserver-d55dfcdfc-6tr79\" (UID: \"03b74b39-100a-4fe4-8bae-0f2088728e24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.191510 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czwvd\" (UniqueName: \"kubernetes.io/projected/cb7e479b-65fb-45dc-bf4b-cda530317c77-kube-api-access-czwvd\") pod \"multus-admission-controller-857f4d67dd-wwwmq\" (UID: \"cb7e479b-65fb-45dc-bf4b-cda530317c77\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.204634 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4zf6\" (UniqueName: \"kubernetes.io/projected/b9d750eb-071c-4580-b626-26b375e56870-kube-api-access-v4zf6\") pod \"olm-operator-6b444d44fb-s8pqf\" (UID: \"b9d750eb-071c-4580-b626-26b375e56870\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.216729 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.222089 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgk8q\" (UniqueName: \"kubernetes.io/projected/86a28668-36ac-4656-b1bb-1fd68c51e6de-kube-api-access-tgk8q\") pod \"etcd-operator-b45778765-8p7g7\" (UID: \"86a28668-36ac-4656-b1bb-1fd68c51e6de\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.222580 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.223171 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:23.723154002 +0000 UTC m=+142.938814952 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.241602 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5bgw\" (UniqueName: \"kubernetes.io/projected/0d64b1f1-f632-4956-a9d8-83703ef96ca1-kube-api-access-l5bgw\") pod \"route-controller-manager-6576b87f9c-p6l5s\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.259747 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dth5r\" (UniqueName: \"kubernetes.io/projected/e8a2cc0c-f6ac-46c5-9d42-5859ec38b291-kube-api-access-dth5r\") pod \"machine-config-operator-74547568cd-sn2lw\" (UID: \"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.278657 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5b2s\" (UniqueName: \"kubernetes.io/projected/45df7d8a-597a-42b0-8116-37bf7d3e7627-kube-api-access-g5b2s\") pod \"image-pruner-29414880-vdbj2\" (UID: \"45df7d8a-597a-42b0-8116-37bf7d3e7627\") " pod="openshift-image-registry/image-pruner-29414880-vdbj2" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.310913 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.329282 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.329612 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:23.829600615 +0000 UTC m=+143.045261565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.331837 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2031118b-50a5-4d09-afab-37e601a1631e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ftl6j\" (UID: \"2031118b-50a5-4d09-afab-37e601a1631e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.340999 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh"] Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.344625 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrqt7\" (UniqueName: \"kubernetes.io/projected/09216400-991e-47e9-8494-ff24c8968a33-kube-api-access-mrqt7\") pod \"csi-hostpathplugin-6hsqf\" (UID: \"09216400-991e-47e9-8494-ff24c8968a33\") " pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.353763 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x99r7\" (UniqueName: \"kubernetes.io/projected/22760a3a-a198-4a84-8de8-20a745b3cb30-kube-api-access-x99r7\") pod \"service-ca-9c57cc56f-9c5t7\" (UID: \"22760a3a-a198-4a84-8de8-20a745b3cb30\") " pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.363841 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.373145 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.426142 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.426176 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.426729 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.426767 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.427156 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2tzs\" (UniqueName: \"kubernetes.io/projected/0206012f-d861-4f45-9dfd-3923117fea31-kube-api-access-c2tzs\") pod \"ingress-canary-9pz94\" (UID: \"0206012f-d861-4f45-9dfd-3923117fea31\") " pod="openshift-ingress-canary/ingress-canary-9pz94" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.431017 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.431477 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:23.931462995 +0000 UTC m=+143.147123945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.436086 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq8bg\" (UniqueName: \"kubernetes.io/projected/f721b883-a5eb-4ecb-9565-e03cdb24c368-kube-api-access-wq8bg\") pod \"service-ca-operator-777779d784-drwnh\" (UID: \"f721b883-a5eb-4ecb-9565-e03cdb24c368\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.438494 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.444240 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwmc6\" (UniqueName: \"kubernetes.io/projected/9ba656f8-77bb-4402-8242-6fe3b116a8cc-kube-api-access-kwmc6\") pod \"collect-profiles-29414895-vsxsw\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.444245 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x4bz\" (UniqueName: \"kubernetes.io/projected/11f20e7f-9d7e-4574-928a-42697c2fdb81-kube-api-access-9x4bz\") pod \"migrator-59844c95c7-6d754\" (UID: \"11f20e7f-9d7e-4574-928a-42697c2fdb81\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.445130 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.451099 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcmss\" (UniqueName: \"kubernetes.io/projected/dc43ed3f-937d-4de9-9e4a-301788d5d19d-kube-api-access-tcmss\") pod \"machine-config-server-9r6hq\" (UID: \"dc43ed3f-937d-4de9-9e4a-301788d5d19d\") " pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.460433 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.461780 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntp6z\" (UniqueName: \"kubernetes.io/projected/1a9ce926-4d8b-4608-9c75-9ddbc87a2464-kube-api-access-ntp6z\") pod \"control-plane-machine-set-operator-78cbb6b69f-2zx9b\" (UID: \"1a9ce926-4d8b-4608-9c75-9ddbc87a2464\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.465061 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.473877 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9pz94" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.488965 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv5k9\" (UniqueName: \"kubernetes.io/projected/650c0e29-8158-4c6a-9b4f-2d6705ca4e87-kube-api-access-xv5k9\") pod \"router-default-5444994796-mhbwk\" (UID: \"650c0e29-8158-4c6a-9b4f-2d6705ca4e87\") " pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.491706 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9r6hq" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.503072 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp58s\" (UniqueName: \"kubernetes.io/projected/70145bc3-1bf0-48e3-ab5d-84b0e04dfd69-kube-api-access-rp58s\") pod \"dns-default-f9knl\" (UID: \"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69\") " pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.507889 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dtbm8"] Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.507935 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ntfxg"] Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.516196 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.518630 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5p86\" (UniqueName: \"kubernetes.io/projected/2183c13d-b67c-403a-8723-8c62e3ad57f3-kube-api-access-t5p86\") pod \"package-server-manager-789f6589d5-66mwb\" (UID: \"2183c13d-b67c-403a-8723-8c62e3ad57f3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.531722 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29414880-vdbj2" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.533871 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.534384 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.034360672 +0000 UTC m=+143.250021632 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.537805 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78n8t\" (UniqueName: \"kubernetes.io/projected/ff68e70c-0561-4324-8cd4-1c8897cff45b-kube-api-access-78n8t\") pod \"machine-config-controller-84d6567774-pc8pz\" (UID: \"ff68e70c-0561-4324-8cd4-1c8897cff45b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.562427 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h7jx\" (UniqueName: \"kubernetes.io/projected/fc1ec533-684e-4f58-8861-adc357bf448e-kube-api-access-9h7jx\") pod \"kube-storage-version-migrator-operator-b67b599dd-dpv8g\" (UID: \"fc1ec533-684e-4f58-8861-adc357bf448e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.582263 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.585172 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s6sm\" (UniqueName: \"kubernetes.io/projected/a6019120-bf7b-47df-9e54-c7761066eb48-kube-api-access-8s6sm\") pod \"marketplace-operator-79b997595-8vtnn\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" Dec 05 
00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.602420 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.604927 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.608753 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.614089 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-policies\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.614943 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.622113 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.622240 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.629571 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.634794 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.635014 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.134969153 +0000 UTC m=+143.350630113 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.635082 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.635613 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.135598928 +0000 UTC m=+143.351259878 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.635613 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-serving-cert\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.637457 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.642284 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.657552 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.660294 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.673842 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.692826 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.692923 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.693926 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.702079 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.703137 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.706085 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76k4t\" (UniqueName: \"kubernetes.io/projected/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-kube-api-access-76k4t\") pod \"controller-manager-879f6c89f-7hqjc\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.712777 4759 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-error: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.712819 4759 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.712863 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error podName:84b8f271-fcc3-4014-8a36-3e7019bef7c5 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.712840319 +0000 UTC m=+143.928501339 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-error" (UniqueName: "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error") pod "oauth-openshift-558db77b4-lr9vd" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.712903 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session podName:84b8f271-fcc3-4014-8a36-3e7019bef7c5 nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.71288599 +0000 UTC m=+143.928546940 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session") pod "oauth-openshift-558db77b4-lr9vd" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5") : failed to sync secret cache: timed out waiting for the condition Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.720624 4759 request.go:700] Waited for 1.770912825s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/secrets?fieldSelector=metadata.name%3Dv4-0-config-system-session&limit=500&resourceVersion=0 Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.722651 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.733152 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gng7x"] Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.770877 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.771172 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.27114058 +0000 UTC m=+143.486801530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.771293 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.774177 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.775335 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.275296021 +0000 UTC m=+143.490956971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.782946 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.791447 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd"] Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.839864 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.859458 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.862367 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wf4wz"] Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.872489 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.872672 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.372646872 +0000 UTC m=+143.588307832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.872887 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.873220 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.373212396 +0000 UTC m=+143.588873336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.877786 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd"] Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.969746 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" event={"ID":"bbbc5689-2333-4227-984e-57e82e237746","Type":"ContainerStarted","Data":"ced6d742ae3b0c0670f4d32aac53f06a2e9c3112ddd1e8f553303d461b0d85b8"} Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.972116 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw"] Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.972158 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t"] Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.973205 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" event={"ID":"adb9d332-b13b-456d-9d04-32124d387a36","Type":"ContainerStarted","Data":"29d9f1d7be682b8a724f500c09a9707ffd120a2824e25268f5d7ccc0690ea10d"} Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.973705 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:23 crc kubenswrapper[4759]: E1205 00:25:23.974015 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.474002181 +0000 UTC m=+143.689663131 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.976020 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" event={"ID":"415154e7-28be-49ac-954d-88342198e56e","Type":"ContainerStarted","Data":"681ac1bf5caa0a8502a111f826dac613fc67c1959bbeca5cc708f232bef5497d"} Dec 05 00:25:23 crc kubenswrapper[4759]: I1205 00:25:23.985233 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g8mxq"] Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.004054 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ppbrk"] Dec 05 00:25:24 crc kubenswrapper[4759]: W1205 00:25:24.061550 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02d7ef91_e7f4_496d_bc04_6a9a4a5ea6d4.slice/crio-b3470ef51ae4a09d7a5d61958674218e3338adeb1889913832a2cddad3f814c8 WatchSource:0}: Error finding container b3470ef51ae4a09d7a5d61958674218e3338adeb1889913832a2cddad3f814c8: Status 404 returned error can't find the container with id b3470ef51ae4a09d7a5d61958674218e3338adeb1889913832a2cddad3f814c8 Dec 05 00:25:24 crc kubenswrapper[4759]: W1205 00:25:24.069199 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6840df22_cb55_402a_9138_567bcdae100c.slice/crio-45c5200b7eeef4022a5fedeb1418792a18ca0a27ca5345676b92534b0b855fed WatchSource:0}: Error finding container 45c5200b7eeef4022a5fedeb1418792a18ca0a27ca5345676b92534b0b855fed: Status 404 returned error can't find the container with id 45c5200b7eeef4022a5fedeb1418792a18ca0a27ca5345676b92534b0b855fed Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.075334 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:24 crc kubenswrapper[4759]: E1205 00:25:24.075827 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.575813881 +0000 UTC m=+143.791474831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.077626 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z"] Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.176745 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:24 crc kubenswrapper[4759]: E1205 00:25:24.177332 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.677282552 +0000 UTC m=+143.892943502 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.314543 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:24 crc kubenswrapper[4759]: E1205 00:25:24.314816 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.814803491 +0000 UTC m=+144.030464431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.342564 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vtldn"] Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.415326 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:24 crc kubenswrapper[4759]: E1205 00:25:24.415669 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:24.915653098 +0000 UTC m=+144.131314048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.516207 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:24 crc kubenswrapper[4759]: E1205 00:25:24.516802 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.016787551 +0000 UTC m=+144.232448501 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.617908 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:24 crc kubenswrapper[4759]: E1205 00:25:24.618010 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.117987627 +0000 UTC m=+144.333648597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.618064 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:24 crc kubenswrapper[4759]: E1205 00:25:24.618424 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.118415567 +0000 UTC m=+144.334076517 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.641952 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j"] Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.718770 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.718968 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.719032 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:24 crc kubenswrapper[4759]: E1205 00:25:24.719545 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.219527869 +0000 UTC m=+144.435188819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.727890 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.819951 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:24 crc kubenswrapper[4759]: E1205 00:25:24.820417 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.320396706 +0000 UTC m=+144.536057646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.921253 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:24 crc kubenswrapper[4759]: E1205 00:25:24.921479 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.421435708 +0000 UTC m=+144.637096668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.921798 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:24 crc kubenswrapper[4759]: E1205 00:25:24.922268 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.422247277 +0000 UTC m=+144.637908227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.998261 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd" event={"ID":"9a888f59-c21c-4786-8a70-cdabfba7a293","Type":"ContainerStarted","Data":"d3472c23acfe6012fe026dbc370c809025519997146c1c596be7b656df1ea549"} Dec 05 00:25:24 crc kubenswrapper[4759]: I1205 00:25:24.999339 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw" event={"ID":"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4","Type":"ContainerStarted","Data":"b3470ef51ae4a09d7a5d61958674218e3338adeb1889913832a2cddad3f814c8"} Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.000113 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x" event={"ID":"f2b40743-a414-4dd8-9613-0bc14b937e3d","Type":"ContainerStarted","Data":"1416a4d26db16d8f4e5bad21f7f144b1afea6c259a60d8fbdb6607a37fd243ca"} Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.000823 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd" event={"ID":"6840df22-cb55-402a-9138-567bcdae100c","Type":"ContainerStarted","Data":"45c5200b7eeef4022a5fedeb1418792a18ca0a27ca5345676b92534b0b855fed"} Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.025259 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:25 crc kubenswrapper[4759]: 
E1205 00:25:25.025831 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.52581655 +0000 UTC m=+144.741477500 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.132546 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.133211 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.633197466 +0000 UTC m=+144.848858416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.161880 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6hsqf"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.172272 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wwwmq"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.177800 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.179819 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.189014 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.206781 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.213487 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.219372 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.230809 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.234876 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.235089 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.735076038 +0000 UTC m=+144.950736988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.256660 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9c5t7"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.336332 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.336637 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.836625911 +0000 UTC m=+145.052286861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.408287 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-drwnh"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.412206 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8p7g7"] Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.436732 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.436857 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.936832782 +0000 UTC m=+145.152493732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.437081 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.437413 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:25.937400935 +0000 UTC m=+145.153061885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.538254 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.538434 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.038412805 +0000 UTC m=+145.254073765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.538769 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.539127 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.039115393 +0000 UTC m=+145.254776353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.639938 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.640136 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.140113983 +0000 UTC m=+145.355774933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.640275 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.640604 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.140596254 +0000 UTC m=+145.356257204 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.741299 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.741634 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.241600404 +0000 UTC m=+145.457261414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.741761 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.742112 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.242095237 +0000 UTC m=+145.457756187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.807655 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lr9vd\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:25 crc kubenswrapper[4759]: W1205 00:25:25.818510 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5488f06_06a1_48b6_9103_abff66383776.slice/crio-5310150db80256c01c73f64dc48a438d6a37fd5a911edfbc172d4388057d7071 WatchSource:0}: Error finding container 5310150db80256c01c73f64dc48a438d6a37fd5a911edfbc172d4388057d7071: Status 404 returned error can't find the container with id 5310150db80256c01c73f64dc48a438d6a37fd5a911edfbc172d4388057d7071 Dec 05 00:25:25 crc kubenswrapper[4759]: W1205 00:25:25.822235 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf99bc61a_b820_4ebd_8ed0_d18cba6c017a.slice/crio-a8c1eccf038db62e442abd265c45e8601add626e9f261d72ddf6820b6ae180c2 WatchSource:0}: Error finding container a8c1eccf038db62e442abd265c45e8601add626e9f261d72ddf6820b6ae180c2: Status 404 returned error can't find the container with id a8c1eccf038db62e442abd265c45e8601add626e9f261d72ddf6820b6ae180c2 Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.842472 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.842688 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.342649776 +0000 UTC m=+145.558310726 (durationBeforeRetry 500ms). 
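The manager.go:1169 warnings above are cadvisor noticing a new crio-&lt;id&gt; cgroup before the runtime can answer a lookup for that container ID, hence the 404; the same IDs reappear moments later in ContainerStarted events, so the warnings are transient. A small, hypothetical helper for correlating the two by extracting the pod UID and container ID from the cgroup path in such a warning:

```go
// Hedged helper: pull the pod UID and container ID out of the cgroup path in
// a cadvisor "Failed to process watch event" warning, so the 404 can be
// matched to the later PLEG ContainerStarted event for the same ID.
package main

import (
	"fmt"
	"regexp"
)

// Cgroup paths encode the pod UID with underscores instead of dashes.
var cgroupRe = regexp.MustCompile(`kubepods-burstable-pod([0-9a-f_]+)\.slice/crio-([0-9a-f]+)`)

func main() {
	path := "/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf27f029e_62d3_497f_8f69_a95229ebe945.slice/crio-126adc6378d971a8649bfba30117398aca5549778a5f19f8487d631757982e5a"
	if m := cgroupRe.FindStringSubmatch(path); m != nil {
		fmt.Println("pod UID :", m[1]) // underscores stand in for dashes
		fmt.Println("ctr ID  :", m[2])
	}
}
```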
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.842843 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.843463 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.343455166 +0000 UTC m=+145.559116116 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.938547 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:25 crc kubenswrapper[4759]: I1205 00:25:25.943940 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:25 crc kubenswrapper[4759]: E1205 00:25:25.944353 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.444332913 +0000 UTC m=+145.659993863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:25 crc kubenswrapper[4759]: W1205 00:25:25.973434 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf27f029e_62d3_497f_8f69_a95229ebe945.slice/crio-126adc6378d971a8649bfba30117398aca5549778a5f19f8487d631757982e5a WatchSource:0}: Error finding container 126adc6378d971a8649bfba30117398aca5549778a5f19f8487d631757982e5a: Status 404 returned error can't find the container with id 126adc6378d971a8649bfba30117398aca5549778a5f19f8487d631757982e5a Dec 05 00:25:25 crc kubenswrapper[4759]: W1205 00:25:25.976776 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2031118b_50a5_4d09_afab_37e601a1631e.slice/crio-10fa9586f7f77df8803c51277232f81871b5a1894f597c3e84a6dfb6473a4c5f WatchSource:0}: Error finding container 10fa9586f7f77df8803c51277232f81871b5a1894f597c3e84a6dfb6473a4c5f: Status 404 returned error can't find the container with id 10fa9586f7f77df8803c51277232f81871b5a1894f597c3e84a6dfb6473a4c5f Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.035492 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds" event={"ID":"56536c3b-13f4-4ead-a2e4-2c30a87ef64c","Type":"ContainerStarted","Data":"30e79baae85d5306f4c772fafe2d260ddb664e419b1815c39e661f19b33bbcf5"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.045749 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.046805 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.546793088 +0000 UTC m=+145.762454038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.046940 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" event={"ID":"398957bf-f56a-4d8c-8e79-73bc19356c88","Type":"ContainerStarted","Data":"b37dedcb125b5a3a702b6130fe0449dee83a59cfec1c22a889f32b57888fd08d"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.051941 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk" event={"ID":"40f30299-1808-43a6-83db-44e27fa0b18e","Type":"ContainerStarted","Data":"98e1032609148bd1fd813276720778ff18cbefb7e993df8cc23c4ec666e4de6d"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.052908 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" event={"ID":"03b74b39-100a-4fe4-8bae-0f2088728e24","Type":"ContainerStarted","Data":"de2d04cdbdc943df94c20c21aea781e8367b3def33fbfb43939fd46ee277547b"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.053646 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vtldn" event={"ID":"f27f029e-62d3-497f-8f69-a95229ebe945","Type":"ContainerStarted","Data":"126adc6378d971a8649bfba30117398aca5549778a5f19f8487d631757982e5a"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.054337 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t" event={"ID":"d5488f06-06a1-48b6-9103-abff66383776","Type":"ContainerStarted","Data":"5310150db80256c01c73f64dc48a438d6a37fd5a911edfbc172d4388057d7071"} Dec 05 00:25:26 crc kubenswrapper[4759]: W1205 00:25:26.092204 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c369f5e_c51c_46f0_a184_7e9c627451f4.slice/crio-d0ef1483570e7febc3f10fc7aa80c5d1ebd647a283411f5ba443b2600e16c327 WatchSource:0}: Error finding container d0ef1483570e7febc3f10fc7aa80c5d1ebd647a283411f5ba443b2600e16c327: Status 404 returned error can't find the container with id d0ef1483570e7febc3f10fc7aa80c5d1ebd647a283411f5ba443b2600e16c327 Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.125928 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" event={"ID":"09216400-991e-47e9-8494-ff24c8968a33","Type":"ContainerStarted","Data":"183f97255c548bc8830944abbb11ae0bab65ea77d5b2f114a4440a7be8ce8360"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.142227 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" event={"ID":"2031118b-50a5-4d09-afab-37e601a1631e","Type":"ContainerStarted","Data":"10fa9586f7f77df8803c51277232f81871b5a1894f597c3e84a6dfb6473a4c5f"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.145762 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" event={"ID":"cb7e479b-65fb-45dc-bf4b-cda530317c77","Type":"ContainerStarted","Data":"b67a1c5b1939a0445e02df2ddd9a5177353b6c81352bd3ef3b31de325a8aabdd"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.149028 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.149617 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.649593193 +0000 UTC m=+145.865254143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.150083 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wf4wz" event={"ID":"f99bc61a-b820-4ebd-8ed0-d18cba6c017a","Type":"ContainerStarted","Data":"a8c1eccf038db62e442abd265c45e8601add626e9f261d72ddf6820b6ae180c2"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.152360 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9r6hq" event={"ID":"dc43ed3f-937d-4de9-9e4a-301788d5d19d","Type":"ContainerStarted","Data":"dfbffbaa78b52ce68b6af0b3210a8c47defbe420fe70fb21e57cd0c1bac13ac4"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.153099 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" event={"ID":"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291","Type":"ContainerStarted","Data":"12967bc573a50901464cbb05c1f679c059619487b54a6ae04723edbc1f7369ba"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.155362 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g8mxq" event={"ID":"576c976f-56ce-4409-8654-e9a6264a71d1","Type":"ContainerStarted","Data":"a140b1bdb62a6b99086c8f3fbdcf7a4e1d9979505a0efb9e232d918a05505ea8"} Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.251197 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.251680 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 00:25:26.751660779 +0000 UTC m=+145.967321809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.352873 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.353056 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.853028558 +0000 UTC m=+146.068689508 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.353331 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.353684 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.853676403 +0000 UTC m=+146.069337353 (durationBeforeRetry 500ms). 
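The "SyncLoop (PLEG)" entries interleaved with the mount errors are the pod lifecycle event generator reporting containers coming up, and they carry a small structured payload. The one that matters for the failure above is the ContainerStarted event for hostpath-provisioner/csi-hostpathplugin-6hsqf, since that pod hosts the missing driver. A sketch that decodes one payload into a struct matching the fields visible in the log (ID is the pod UID, Type the lifecycle event, Data the container or sandbox ID):

```go
// Hedged sketch: decode the event={...} payload from a PLEG log entry.
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

type plegEvent struct {
	ID   string `json:"ID"`   // pod UID
	Type string `json:"Type"` // e.g. ContainerStarted
	Data string `json:"Data"` // container or sandbox ID
}

func main() {
	// Payload copied from the csi-hostpathplugin-6hsqf entry above.
	raw := `{"ID":"09216400-991e-47e9-8494-ff24c8968a33","Type":"ContainerStarted","Data":"183f97255c548bc8830944abbb11ae0bab65ea77d5b2f114a4440a7be8ce8360"}`
	var ev plegEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("pod %s: %s (%s...)\n", ev.ID, ev.Type, ev.Data[:12])
}
```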
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.454881 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.455126 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.955075743 +0000 UTC m=+146.170736693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.455229 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.455715 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:26.955697698 +0000 UTC m=+146.171358658 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.557788 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29414880-vdbj2"] Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.559992 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.560393 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.060373858 +0000 UTC m=+146.276034808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.564814 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s"] Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.570508 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b"] Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.571941 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9pz94"] Dec 05 00:25:26 crc kubenswrapper[4759]: W1205 00:25:26.602792 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0206012f_d861_4f45_9dfd_3923117fea31.slice/crio-54fce3e99c032d2600d315f55a214e097169c7d4bc5a94e4b0574d37e28f49d7 WatchSource:0}: Error finding container 54fce3e99c032d2600d315f55a214e097169c7d4bc5a94e4b0574d37e28f49d7: Status 404 returned error can't find the container with id 54fce3e99c032d2600d315f55a214e097169c7d4bc5a94e4b0574d37e28f49d7 Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.670913 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.671788 4759 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.171738661 +0000 UTC m=+146.387399611 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.771746 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.772159 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.272138706 +0000 UTC m=+146.487799656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.840389 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vtnn"] Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.860255 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f9knl"] Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.873403 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.873714 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.37370078 +0000 UTC m=+146.589361730 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.892575 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754"] Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.903468 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb"] Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.973858 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.974016 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.473988402 +0000 UTC m=+146.689649352 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:26 crc kubenswrapper[4759]: I1205 00:25:26.974184 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:26 crc kubenswrapper[4759]: E1205 00:25:26.974517 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.474505225 +0000 UTC m=+146.690166175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.074982 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.075174 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.575148566 +0000 UTC m=+146.790809516 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.075569 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.075948 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.575935925 +0000 UTC m=+146.791596935 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: W1205 00:25:27.123241 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45df7d8a_597a_42b0_8116_37bf7d3e7627.slice/crio-39ae55bd0855b03ac8c164ea0c1e28dda9c11d3be203ec80acd9f1d587da39a6 WatchSource:0}: Error finding container 39ae55bd0855b03ac8c164ea0c1e28dda9c11d3be203ec80acd9f1d587da39a6: Status 404 returned error can't find the container with id 39ae55bd0855b03ac8c164ea0c1e28dda9c11d3be203ec80acd9f1d587da39a6 Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.129361 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g"] Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.134750 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw"] Dec 05 00:25:27 crc kubenswrapper[4759]: W1205 00:25:27.156227 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d64b1f1_f632_4956_a9d8_83703ef96ca1.slice/crio-ad8e38cbfbac8dd01bd7223f364bc86e030c7c0ddada8b67eec0421e28b3468c WatchSource:0}: Error finding container ad8e38cbfbac8dd01bd7223f364bc86e030c7c0ddada8b67eec0421e28b3468c: Status 404 returned error can't find the container with id ad8e38cbfbac8dd01bd7223f364bc86e030c7c0ddada8b67eec0421e28b3468c Dec 05 00:25:27 crc kubenswrapper[4759]: W1205 00:25:27.159783 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod650c0e29_8158_4c6a_9b4f_2d6705ca4e87.slice/crio-d0fe1749847b1962868fc09fcf44b1010f511dc1d62b52078b67611725aabac3 WatchSource:0}: Error finding container d0fe1749847b1962868fc09fcf44b1010f511dc1d62b52078b67611725aabac3: Status 404 returned error can't find the container with id d0fe1749847b1962868fc09fcf44b1010f511dc1d62b52078b67611725aabac3 Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.165849 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9pz94" event={"ID":"0206012f-d861-4f45-9dfd-3923117fea31","Type":"ContainerStarted","Data":"54fce3e99c032d2600d315f55a214e097169c7d4bc5a94e4b0574d37e28f49d7"} Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.167037 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29414880-vdbj2" event={"ID":"45df7d8a-597a-42b0-8116-37bf7d3e7627","Type":"ContainerStarted","Data":"39ae55bd0855b03ac8c164ea0c1e28dda9c11d3be203ec80acd9f1d587da39a6"} Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.167927 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" event={"ID":"86a28668-36ac-4656-b1bb-1fd68c51e6de","Type":"ContainerStarted","Data":"8975a72a1f416844807a0b5d4c9aee9c6ac9736f9f38ed088d146f1cfffb337e"} Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.171119 4759 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" event={"ID":"3c369f5e-c51c-46f0-a184-7e9c627451f4","Type":"ContainerStarted","Data":"d0ef1483570e7febc3f10fc7aa80c5d1ebd647a283411f5ba443b2600e16c327"} Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.172786 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" event={"ID":"22760a3a-a198-4a84-8de8-20a745b3cb30","Type":"ContainerStarted","Data":"4846aeb59ad184576104aa1dec8be3391e73af1886f040e85ef811dde7e93235"} Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.173830 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" event={"ID":"f721b883-a5eb-4ecb-9565-e03cdb24c368","Type":"ContainerStarted","Data":"7c0bf5af07ee8f66cf1206e8d2bbb622675a3c965e9c217999590b8c20dc44ff"} Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.174751 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" event={"ID":"b9d750eb-071c-4580-b626-26b375e56870","Type":"ContainerStarted","Data":"cc9cf99996d55d975c70bfa539f6acd296b25bb445d86da282455bc727337054"} Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.175490 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b" event={"ID":"1a9ce926-4d8b-4608-9c75-9ddbc87a2464","Type":"ContainerStarted","Data":"155dd6b12593a586e2565812b99ee6b55207a4ec103b4f93ac3025180b57981a"} Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.175952 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.176138 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.676118676 +0000 UTC m=+146.891779626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.176249 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.176568 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.676555546 +0000 UTC m=+146.892216496 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.176570 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" event={"ID":"ca2cc56f-b0a3-4e49-a9dd-c8918810f423","Type":"ContainerStarted","Data":"e5f3b6fdfdfae38eba74bbca00ffae5a451f194f82deb4d9ef373b8729284a94"} Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.177274 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" event={"ID":"392f2eff-cfc4-431e-83ad-a8c9328aa8a8","Type":"ContainerStarted","Data":"4bc54c2c5b08d0ac1d1fdfafd36b8c0aee61fe48b40f10365aa7d02bbd5f3d64"} Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.277008 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.277179 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.777157687 +0000 UTC m=+146.992818637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.277253 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.277569 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.777560746 +0000 UTC m=+146.993221696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.378231 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.378415 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.878390642 +0000 UTC m=+147.094051592 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.378507 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.378791 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.878783433 +0000 UTC m=+147.094444383 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.479868 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.480086 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.980060439 +0000 UTC m=+147.195721389 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.480515 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.480837 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:27.980825647 +0000 UTC m=+147.196486597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.581706 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.581880 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.081858299 +0000 UTC m=+147.297519259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.581951 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.582438 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.082416342 +0000 UTC m=+147.298077302 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.683626 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.683878 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.183844832 +0000 UTC m=+147.399505822 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.684263 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.684815 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.184791485 +0000 UTC m=+147.400452475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.785018 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.785235 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.285213342 +0000 UTC m=+147.500874302 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.886787 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.887354 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.387332169 +0000 UTC m=+147.602993159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.988021 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.988283 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.488252457 +0000 UTC m=+147.703913417 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:27 crc kubenswrapper[4759]: I1205 00:25:27.988432 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:27 crc kubenswrapper[4759]: E1205 00:25:27.988990 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.488963794 +0000 UTC m=+147.704624774 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.089722 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.089913 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.589879332 +0000 UTC m=+147.805540332 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.090197 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.090867 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.590840726 +0000 UTC m=+147.806501716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.184802 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" event={"ID":"0d64b1f1-f632-4956-a9d8-83703ef96ca1","Type":"ContainerStarted","Data":"ad8e38cbfbac8dd01bd7223f364bc86e030c7c0ddada8b67eec0421e28b3468c"} Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.186276 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mhbwk" event={"ID":"650c0e29-8158-4c6a-9b4f-2d6705ca4e87","Type":"ContainerStarted","Data":"d0fe1749847b1962868fc09fcf44b1010f511dc1d62b52078b67611725aabac3"} Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.190940 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.191448 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.691412316 +0000 UTC m=+147.907073316 (durationBeforeRetry 500ms). 
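Interleaved with the volume retries are warnings like "Failed to process watch event ... Status 404 returned error can't find the container with id ...". Those come from the kubelet's embedded cAdvisor (the manager.go lines above) racing container startup: the cgroup directory appears first, the lookup happens before CRI-O can serve the new ID, and the inspection 404s. They are transient here; each ID that 404'd (39ae55bd..., ad8e38cb..., d0fe1749...) is followed by a "SyncLoop (PLEG)" ContainerStarted event for the same ID within about a second. A rough way to confirm that mechanically rather than by eye is to cross-reference the two record types; the sketch below is an assumed helper in the same spirit as the other sketches here, reading plain journal text on stdin.

package main

import (
	"fmt"
	"io"
	"os"
	"regexp"
)

func main() {
	data, err := io.ReadAll(os.Stdin)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	text := string(data)

	// Container IDs that cAdvisor failed to look up (the 404 warnings).
	notFound := regexp.MustCompile(`Error finding container ([0-9a-f]{64})`).FindAllStringSubmatch(text, -1)

	// Container IDs that PLEG reported as started.
	started := map[string]bool{}
	for _, m := range regexp.MustCompile(`"ContainerStarted","Data":"([0-9a-f]{64})"`).FindAllStringSubmatch(text, -1) {
		started[m[1]] = true
	}

	for _, m := range notFound {
		id := m[1]
		if started[id] {
			fmt.Printf("%s: 404 was transient, ContainerStarted seen\n", id[:12])
		} else {
			fmt.Printf("%s: 404 with no ContainerStarted in this excerpt\n", id[:12])
		}
	}
}

Run over this excerpt it reports the earlier IDs as transient; the newest one (837c4eb1...) has no ContainerStarted line within the excerpt, which is expected when the window is cut mid-startup.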
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.293608 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.294245 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.794222259 +0000 UTC m=+148.009883239 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.394676 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.394853 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.894817139 +0000 UTC m=+148.110478129 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.394959 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.395451 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.895424825 +0000 UTC m=+148.111085815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.496816 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.497036 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.996988318 +0000 UTC m=+148.212649318 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.497171 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.497683 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:28.997664395 +0000 UTC m=+148.213325385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.598418 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.598581 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.098552962 +0000 UTC m=+148.314213942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.598751 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.599196 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.099179867 +0000 UTC m=+148.314840847 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.700334 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.700639 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.200598978 +0000 UTC m=+148.416259958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.802183 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.802670 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.302643414 +0000 UTC m=+148.518304384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.903576 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.903789 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.403755076 +0000 UTC m=+148.619416056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: I1205 00:25:28.904043 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:28 crc kubenswrapper[4759]: E1205 00:25:28.904548 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.404502104 +0000 UTC m=+148.620163084 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:28 crc kubenswrapper[4759]: W1205 00:25:28.941273 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70145bc3_1bf0_48e3_ab5d_84b0e04dfd69.slice/crio-002d9b2381c455a3aee01ddfb163571661d6948f0a45fbb8bc1e8af1d5d2cfef WatchSource:0}: Error finding container 002d9b2381c455a3aee01ddfb163571661d6948f0a45fbb8bc1e8af1d5d2cfef: Status 404 returned error can't find the container with id 002d9b2381c455a3aee01ddfb163571661d6948f0a45fbb8bc1e8af1d5d2cfef Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.005119 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.005582 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.505553505 +0000 UTC m=+148.721214475 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.005718 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.006067 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.506054168 +0000 UTC m=+148.721715118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.106398 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.106681 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.606661589 +0000 UTC m=+148.822322539 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.127845 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7hqjc"] Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.193637 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f9knl" event={"ID":"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69","Type":"ContainerStarted","Data":"002d9b2381c455a3aee01ddfb163571661d6948f0a45fbb8bc1e8af1d5d2cfef"} Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.195144 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754" event={"ID":"11f20e7f-9d7e-4574-928a-42697c2fdb81","Type":"ContainerStarted","Data":"abb1de63731e2bde09d73c287960663e8f2ba8ee73f21a73f9eb6d2f80439361"} Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.196540 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" event={"ID":"fc1ec533-684e-4f58-8861-adc357bf448e","Type":"ContainerStarted","Data":"9ce59db3b9c3298b4051f6a7b8481dfa9211c5b9d0139cd4939600d8ff0d9061"} Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.197831 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" event={"ID":"a6019120-bf7b-47df-9e54-c7761066eb48","Type":"ContainerStarted","Data":"7c26b3bd442afedaced6769ce0f0908652e8c9b0eaa1151f60163aad8678f91a"} Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.198611 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh" event={"ID":"08f680e4-29bc-4ffc-962b-1a3151e5e41f","Type":"ContainerStarted","Data":"07d4aa1b8f629b281b7d5d9de2a3073eea8da89b5ea3f77b80c1e593613549e9"} Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.199186 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" event={"ID":"9ba656f8-77bb-4402-8242-6fe3b116a8cc","Type":"ContainerStarted","Data":"ff6788171f883e26326545b2543f8e46cf6ce186f4e6d0dd47c76639e92beb8e"} Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.199797 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" event={"ID":"2183c13d-b67c-403a-8723-8c62e3ad57f3","Type":"ContainerStarted","Data":"7d6e427a446245b71dea355bca8f0dd79b0415645286fab3b337c891c7c2c664"} Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.208816 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 
00:25:29.209252 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.709236117 +0000 UTC m=+148.924897067 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.240810 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lr9vd"] Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.293059 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz"] Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.309569 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.309738 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.809718104 +0000 UTC m=+149.025379064 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.309824 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.310132 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.810122774 +0000 UTC m=+149.025783734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.410996 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.411234 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.911200446 +0000 UTC m=+149.126861436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.411540 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.412003 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:29.911982425 +0000 UTC m=+149.127643415 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.513410 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.513891 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.013865726 +0000 UTC m=+149.229526706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.615232 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.615816 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.115790679 +0000 UTC m=+149.331451669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:29 crc kubenswrapper[4759]: W1205 00:25:29.632167 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae425cf_9dc3_471e_a2b8_506eedb29c8d.slice/crio-837c4eb1c40cd901e186850f9c548a4e6d158062ef39fb804b6185535a1888c9 WatchSource:0}: Error finding container 837c4eb1c40cd901e186850f9c548a4e6d158062ef39fb804b6185535a1888c9: Status 404 returned error can't find the container with id 837c4eb1c40cd901e186850f9c548a4e6d158062ef39fb804b6185535a1888c9
Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.716327 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.716810 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.216787229 +0000 UTC m=+149.432448189 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.820039 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.820394 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.320379393 +0000 UTC m=+149.536040343 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:29 crc kubenswrapper[4759]: I1205 00:25:29.921116 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:29 crc kubenswrapper[4759]: E1205 00:25:29.921755 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.421734431 +0000 UTC m=+149.637395381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.022220 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.022853 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.522837294 +0000 UTC m=+149.738498244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.123070 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.123251 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.123272 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.123322 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.123344 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.134874 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.134931 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.634905693 +0000 UTC m=+149.850566643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.140444 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.140479 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.140720 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.181651 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.224969 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.225384 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.725370997 +0000 UTC m=+149.941031937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.303254 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" event={"ID":"415154e7-28be-49ac-954d-88342198e56e","Type":"ContainerStarted","Data":"748af7348ed209758a25c60deb2bc8c5d5469df04d0fa3630862bf185c5f01ac"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.315043 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" event={"ID":"84b8f271-fcc3-4014-8a36-3e7019bef7c5","Type":"ContainerStarted","Data":"8bae4a32a2d8ee1d41aec691844fc5cb647da5a4a3805811ad7b04664500adc0"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.325577 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.325907 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.825894015 +0000 UTC m=+150.041554965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.326041 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" event={"ID":"b9d750eb-071c-4580-b626-26b375e56870","Type":"ContainerStarted","Data":"29f13714c2200fbe04b0f396c853c3cf7af0ff07fa9b344548a5371229ad0e82"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.326827 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.346760 4759 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-s8pqf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.346815 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" podUID="b9d750eb-071c-4580-b626-26b375e56870" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.361249 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dtbm8" podStartSLOduration=129.361230406 podStartE2EDuration="2m9.361230406s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:30.360747864 +0000 UTC m=+149.576408814" watchObservedRunningTime="2025-12-05 00:25:30.361230406 +0000 UTC m=+149.576891356"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.369934 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.380345 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.380854 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" podStartSLOduration=128.380821083 podStartE2EDuration="2m8.380821083s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:30.379766267 +0000 UTC m=+149.595427217" watchObservedRunningTime="2025-12-05 00:25:30.380821083 +0000 UTC m=+149.596482033"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.417750 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" event={"ID":"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291","Type":"ContainerStarted","Data":"787a74362320e00e732eaf40e5f4fae59107fdf779c2bcaf3d5905f555f8309d"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.427357 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t" event={"ID":"d5488f06-06a1-48b6-9103-abff66383776","Type":"ContainerStarted","Data":"08fcb7b6bd36c8e31da58d09f28c4ffdf458cc4c8908ed301d364c633e079299"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.430196 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.432280 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:30.932253935 +0000 UTC m=+150.147914885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.448439 4759 generic.go:334] "Generic (PLEG): container finished" podID="adb9d332-b13b-456d-9d04-32124d387a36" containerID="c7f027ddc0768f41c43b6e42d72266c42423e96ba49ec9f1a0d748ea3428ed44" exitCode=0
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.448546 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" event={"ID":"adb9d332-b13b-456d-9d04-32124d387a36","Type":"ContainerDied","Data":"c7f027ddc0768f41c43b6e42d72266c42423e96ba49ec9f1a0d748ea3428ed44"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.450579 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rzm5t" podStartSLOduration=129.450560101 podStartE2EDuration="2m9.450560101s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:30.448462941 +0000 UTC m=+149.664123891" watchObservedRunningTime="2025-12-05 00:25:30.450560101 +0000 UTC m=+149.666221051"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.463452 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" event={"ID":"0ae425cf-9dc3-471e-a2b8-506eedb29c8d","Type":"ContainerStarted","Data":"837c4eb1c40cd901e186850f9c548a4e6d158062ef39fb804b6185535a1888c9"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.469247 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw" event={"ID":"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4","Type":"ContainerStarted","Data":"98becd9e2abddfc005f924cb393a0c9314b79ecfb6eb641c601bfc5252850bf8"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.474212 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" event={"ID":"392f2eff-cfc4-431e-83ad-a8c9328aa8a8","Type":"ContainerStarted","Data":"974f6f0424e3b3461936f8d53dbcd79e6dd429ef2d7af10b134d3b66925c7d76"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.475135 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.476438 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vtldn" event={"ID":"f27f029e-62d3-497f-8f69-a95229ebe945","Type":"ContainerStarted","Data":"1f86b59434aab55e1eacf3009bb1a357fa843495042eaa090e9fe837df22cdc0"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.477094 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vtldn"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.479901 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x" event={"ID":"f2b40743-a414-4dd8-9613-0bc14b937e3d","Type":"ContainerStarted","Data":"357f5b8e586720a4a16370b69d195c8849286c5f96b27b50a43af354a124bf10"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.482123 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9r6hq" event={"ID":"dc43ed3f-937d-4de9-9e4a-301788d5d19d","Type":"ContainerStarted","Data":"e355e0094113b8b6f5e03b414fa00d4c2012e0bd5b6e5c3fefd4df086bc55f81"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.483740 4759 patch_prober.go:28] interesting pod/console-operator-58897d9998-vtldn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.483771 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vtldn" podUID="f27f029e-62d3-497f-8f69-a95229ebe945" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.483783 4759 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fpl7m container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body=
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.483829 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" podUID="392f2eff-cfc4-431e-83ad-a8c9328aa8a8" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.484784 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd" event={"ID":"9a888f59-c21c-4786-8a70-cdabfba7a293","Type":"ContainerStarted","Data":"7c5a410c68dd3158e83d09d15652d11c9e5cfc0d704a4401cfa6d9b3b542bd96"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.501402 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9r6hq" podStartSLOduration=10.50138241 podStartE2EDuration="10.50138241s" podCreationTimestamp="2025-12-05 00:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:30.499783631 +0000 UTC m=+149.715444581" watchObservedRunningTime="2025-12-05 00:25:30.50138241 +0000 UTC m=+149.717043360"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.502931 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g8mxq" event={"ID":"576c976f-56ce-4409-8654-e9a6264a71d1","Type":"ContainerStarted","Data":"6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.509530 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" event={"ID":"ff68e70c-0561-4324-8cd4-1c8897cff45b","Type":"ContainerStarted","Data":"2b5820fc5f28f89e608e6834deeedf30bb11209ca93443f1581e1556b54c0f29"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.513141 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" event={"ID":"bbbc5689-2333-4227-984e-57e82e237746","Type":"ContainerStarted","Data":"5891a156bb0317bb9df93448b119cccb843d79f6680ab018d1a11c5e7bd18bd4"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.516986 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" event={"ID":"398957bf-f56a-4d8c-8e79-73bc19356c88","Type":"ContainerStarted","Data":"6069878dd8e0c4f463852cc66a7bc65e92d6d37762375c3ad0c2c546e4d04936"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.520899 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" event={"ID":"2031118b-50a5-4d09-afab-37e601a1631e","Type":"ContainerStarted","Data":"78b9dd9f36f79a8ce57398d3f25b628391b2dcb10e0a6f83746046b73edf07bd"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.528846 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" event={"ID":"3c369f5e-c51c-46f0-a184-7e9c627451f4","Type":"ContainerStarted","Data":"efc9c8ecb3e8bf2d4cf1327b5f22b2784db3bac3b6c15378285d0769f3cf087a"}
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.533203 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.534231 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.034209589 +0000 UTC m=+150.249870539 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.547627 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m" podStartSLOduration=128.54735511 podStartE2EDuration="2m8.54735511s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:30.543152948 +0000 UTC m=+149.758813898" watchObservedRunningTime="2025-12-05 00:25:30.54735511 +0000 UTC m=+149.763016060"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.562295 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vtldn" podStartSLOduration=129.562279323 podStartE2EDuration="2m9.562279323s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:30.561098404 +0000 UTC m=+149.776759354" watchObservedRunningTime="2025-12-05 00:25:30.562279323 +0000 UTC m=+149.777940273"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.592853 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-slm7z" podStartSLOduration=128.592827697 podStartE2EDuration="2m8.592827697s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:30.591051954 +0000 UTC m=+149.806712914" watchObservedRunningTime="2025-12-05 00:25:30.592827697 +0000 UTC m=+149.808488657"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.607873 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-g8mxq" podStartSLOduration=129.607857093 podStartE2EDuration="2m9.607857093s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:30.606208833 +0000 UTC m=+149.821869783" watchObservedRunningTime="2025-12-05 00:25:30.607857093 +0000 UTC m=+149.823518043"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.622641 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6fgg" podStartSLOduration=128.622609372 podStartE2EDuration="2m8.622609372s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:30.619905856 +0000 UTC m=+149.835566806" watchObservedRunningTime="2025-12-05 00:25:30.622609372 +0000 UTC m=+149.838270322"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.636488 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.636602 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5b4pd" podStartSLOduration=129.636583102 podStartE2EDuration="2m9.636583102s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:30.635664311 +0000 UTC m=+149.851325281" watchObservedRunningTime="2025-12-05 00:25:30.636583102 +0000 UTC m=+149.852244052"
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.640062 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.140049437 +0000 UTC m=+150.355710487 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.738189 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.738542 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.238515035 +0000 UTC m=+150.454175995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.738723 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.739282 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.239247524 +0000 UTC m=+150.454908474 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.839935 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.840074 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.340047019 +0000 UTC m=+150.555707969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.840403 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.840820 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.340808427 +0000 UTC m=+150.556469427 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:30 crc kubenswrapper[4759]: I1205 00:25:30.943999 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:30 crc kubenswrapper[4759]: E1205 00:25:30.944421 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.44440184 +0000 UTC m=+150.660062790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.053295 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:31 crc kubenswrapper[4759]: E1205 00:25:31.053824 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.553814015 +0000 UTC m=+150.769474965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.154451 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:31 crc kubenswrapper[4759]: E1205 00:25:31.154800 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.654783864 +0000 UTC m=+150.870444814 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.260958 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:31 crc kubenswrapper[4759]: E1205 00:25:31.261214 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.761204457 +0000 UTC m=+150.976865407 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.361939 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:31 crc kubenswrapper[4759]: E1205 00:25:31.362653 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.862629208 +0000 UTC m=+151.078290158 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.478044 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:31 crc kubenswrapper[4759]: E1205 00:25:31.478446 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:31.978431558 +0000 UTC m=+151.194092508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.582933 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:31 crc kubenswrapper[4759]: E1205 00:25:31.583379 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:32.083298182 +0000 UTC m=+151.298959132 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.602483 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b" event={"ID":"1a9ce926-4d8b-4608-9c75-9ddbc87a2464","Type":"ContainerStarted","Data":"58f0dfbca84b6994af09785b8c0bd55ee4d99cca181aca47ac2024c536bc6e2b"}
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.619614 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" event={"ID":"86a28668-36ac-4656-b1bb-1fd68c51e6de","Type":"ContainerStarted","Data":"c5db290dc28f556a4fae5c7063b8d4d5afeb17eeffee3f63512d7f37e0f510da"}
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.643631 4759 generic.go:334] "Generic (PLEG): container finished" podID="02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4" containerID="98becd9e2abddfc005f924cb393a0c9314b79ecfb6eb641c601bfc5252850bf8" exitCode=0
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.643731 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw" event={"ID":"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4","Type":"ContainerDied","Data":"98becd9e2abddfc005f924cb393a0c9314b79ecfb6eb641c601bfc5252850bf8"}
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.644940 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" event={"ID":"22760a3a-a198-4a84-8de8-20a745b3cb30","Type":"ContainerStarted","Data":"5463c05433f8af7e4dbf39d0065b9e25f07b53b23eb082c9efcfef5ee2de1a16"}
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.660528 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wf4wz" event={"ID":"f99bc61a-b820-4ebd-8ed0-d18cba6c017a","Type":"ContainerStarted","Data":"01ed31f23320651f59fd3c2115392c162c0e0d3528addccf0d3137478b4b1459"}
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.660587 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-wf4wz"
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.663508 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" event={"ID":"09216400-991e-47e9-8494-ff24c8968a33","Type":"ContainerStarted","Data":"4ce67f8bc1d172e6260f09ab3291e31c5e86fb7fa7df85f756572b900f4234e7"}
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.664697 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.664749 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.665104 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29414880-vdbj2" event={"ID":"45df7d8a-597a-42b0-8116-37bf7d3e7627","Type":"ContainerStarted","Data":"856838aa559f79a83b467a824dfdf54073acf1614c264906d5b28fa17b422af0"}
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.683028 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds" event={"ID":"56536c3b-13f4-4ead-a2e4-2c30a87ef64c","Type":"ContainerStarted","Data":"113c1c012b8602126f54e7e9489d6eb29b02590ec294553e5109ed68cf0ac1b8"}
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.718263 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:31 crc kubenswrapper[4759]: E1205 00:25:31.720372 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:32.22035655 +0000 UTC m=+151.436017560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.753810 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk" event={"ID":"40f30299-1808-43a6-83db-44e27fa0b18e","Type":"ContainerStarted","Data":"272fe5e118ce2512a90fa90d1c3c19a2d1af09cab2777674da3ea1c63a41e17e"}
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.755064 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" event={"ID":"03b74b39-100a-4fe4-8bae-0f2088728e24","Type":"ContainerStarted","Data":"a24e5c2fcf90b9e7983c03edb648e0f2e1c62b5b3f820e1cc876a9fd3bc57b6b"}
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.755516 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79"
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.763736 4759 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6tr79 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" start-of-body=
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.763789 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" podUID="03b74b39-100a-4fe4-8bae-0f2088728e24" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused"
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.822019 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:31 crc kubenswrapper[4759]: E1205 00:25:31.822187 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:32.32216108 +0000 UTC m=+151.537822020 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.822367 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:31 crc kubenswrapper[4759]: E1205 00:25:31.822717 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:32.322703453 +0000 UTC m=+151.538364403 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.845526 4759 generic.go:334] "Generic (PLEG): container finished" podID="ca2cc56f-b0a3-4e49-a9dd-c8918810f423" containerID="5e30d2461fe29249f97d26b89a7e73a36e418e2069bce9e57918e52d2bc72d55" exitCode=0
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.845622 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" event={"ID":"ca2cc56f-b0a3-4e49-a9dd-c8918810f423","Type":"ContainerDied","Data":"5e30d2461fe29249f97d26b89a7e73a36e418e2069bce9e57918e52d2bc72d55"}
Dec 05 00:25:31 crc kubenswrapper[4759]: I1205 00:25:31.923967 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:31 crc kubenswrapper[4759]: E1205 00:25:31.925095 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:32.425080048 +0000 UTC m=+151.640740998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:31.974501 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" event={"ID":"f721b883-a5eb-4ecb-9565-e03cdb24c368","Type":"ContainerStarted","Data":"365ef5981de9afa7ba7f775ac50402185d80d43b4e924ddfb22f0dd3613c88c6"}
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:31.991183 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9pz94" event={"ID":"0206012f-d861-4f45-9dfd-3923117fea31","Type":"ContainerStarted","Data":"4bb01a135ad820e9e53fec15e4c7f29cc09dadf2f3d3234b68537834fc4100ad"}
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.035531 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:32 crc kubenswrapper[4759]: E1205 00:25:32.037468 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:32.537455174 +0000 UTC m=+151.753116124 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.046940 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd" event={"ID":"6840df22-cb55-402a-9138-567bcdae100c","Type":"ContainerStarted","Data":"3b979ef1c3b93d4e52d396101c4ec905e7172d4a34bea43da7388ac1757addaa"}
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.072892 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" event={"ID":"cb7e479b-65fb-45dc-bf4b-cda530317c77","Type":"ContainerStarted","Data":"ef45c71360c6d848e30f30e5f112d9f1af0ad7ea23ccbd2760feac17193103fd"}
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.074996 4759 patch_prober.go:28] interesting pod/console-operator-58897d9998-vtldn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.075028 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vtldn" podUID="f27f029e-62d3-497f-8f69-a95229ebe945" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.092474 4759 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-s8pqf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.092522 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf" podUID="b9d750eb-071c-4580-b626-26b375e56870" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused"
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.155116 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:32 crc kubenswrapper[4759]: E1205 00:25:32.157834 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:32.657816556 +0000 UTC m=+151.873477506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.160644 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fpl7m"
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.291458 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:32 crc kubenswrapper[4759]: E1205 00:25:32.292062 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:32.792046196 +0000 UTC m=+152.007707146 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.396131 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:32 crc kubenswrapper[4759]: E1205 00:25:32.396415 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:32.896398747 +0000 UTC m=+152.112059697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.396483 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:32 crc kubenswrapper[4759]: E1205 00:25:32.396804 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:32.896797247 +0000 UTC m=+152.112458197 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.437428 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-wf4wz" podStartSLOduration=131.437409406 podStartE2EDuration="2m11.437409406s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:32.434902905 +0000 UTC m=+151.650563855" watchObservedRunningTime="2025-12-05 00:25:32.437409406 +0000 UTC m=+151.653070356"
Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.501838 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:32 crc kubenswrapper[4759]: E1205 00:25:32.502238 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:33.002217145 +0000 UTC m=+152.217878095 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.602984 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:32 crc kubenswrapper[4759]: E1205 00:25:32.603398 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:33.103382788 +0000 UTC m=+152.319043748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.609633 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8p7g7" podStartSLOduration=130.609617621 podStartE2EDuration="2m10.609617621s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:32.60833658 +0000 UTC m=+151.823997530" watchObservedRunningTime="2025-12-05 00:25:32.609617621 +0000 UTC m=+151.825278571" Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.611759 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9pz94" podStartSLOduration=12.611738672 podStartE2EDuration="12.611738672s" podCreationTimestamp="2025-12-05 00:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:32.506186482 +0000 UTC m=+151.721847442" watchObservedRunningTime="2025-12-05 00:25:32.611738672 +0000 UTC m=+151.827399622" Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.706221 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:32 crc kubenswrapper[4759]: E1205 00:25:32.706687 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-05 00:25:33.206669055 +0000 UTC m=+152.422330015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.833874 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:32 crc kubenswrapper[4759]: E1205 00:25:32.834254 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:33.334241862 +0000 UTC m=+152.549902812 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.897230 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hzds" podStartSLOduration=130.897209655 podStartE2EDuration="2m10.897209655s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:32.711862691 +0000 UTC m=+151.927523641" watchObservedRunningTime="2025-12-05 00:25:32.897209655 +0000 UTC m=+152.112870605" Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.898439 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" podStartSLOduration=130.898434775 podStartE2EDuration="2m10.898434775s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:32.896437626 +0000 UTC m=+152.112098576" watchObservedRunningTime="2025-12-05 00:25:32.898434775 +0000 UTC m=+152.114095725" Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.904000 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.904047 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" 
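The MountVolume/UnmountVolume failures repeating above share a single cause: the kubevirt.io.hostpath-provisioner CSI driver has not yet registered with the kubelet, so every mount and teardown attempt fails fast and is requeued with the 500ms durationBeforeRetry backoff. A minimal way to check registration, assuming `oc` access to the cluster and a root shell on the node (the registry path is the kubelet default; the exact socket name is an assumption based on node-driver-registrar's usual naming):

    # Cluster side: is the CSIDriver object present?
    oc get csidriver kubevirt.io.hostpath-provisioner
    # Node side: has the driver's registration socket appeared?
    ls /var/lib/kubelet/plugins_registry/
    # Typically shows kubevirt.io.hostpath-provisioner-reg.sock once the
    # driver's node plugin pod is running and has registered

Once the node plugin registers, these retries succeed on their own; the repetition is the kubelet's normal nestedpendingoperations retry loop, not an additional fault.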
podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.911136 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.911185 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.935000 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:32 crc kubenswrapper[4759]: E1205 00:25:32.935330 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:33.435301043 +0000 UTC m=+152.650961983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:32 crc kubenswrapper[4759]: W1205 00:25:32.966454 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ab30578e1645c6ae4a3aa18e9a4318aa9cd929dbf957b0b7f349d55100b7078b WatchSource:0}: Error finding container ab30578e1645c6ae4a3aa18e9a4318aa9cd929dbf957b0b7f349d55100b7078b: Status 404 returned error can't find the container with id ab30578e1645c6ae4a3aa18e9a4318aa9cd929dbf957b0b7f349d55100b7078b Dec 05 00:25:32 crc kubenswrapper[4759]: I1205 00:25:32.967480 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vm2fd" podStartSLOduration=130.967466117 podStartE2EDuration="2m10.967466117s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:32.965739245 +0000 UTC m=+152.181400185" watchObservedRunningTime="2025-12-05 00:25:32.967466117 +0000 UTC m=+152.183127067" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.018034 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.018206 
4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-g8mxq" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.024357 4759 patch_prober.go:28] interesting pod/console-f9d7485db-g8mxq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.024404 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g8mxq" podUID="576c976f-56ce-4409-8654-e9a6264a71d1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.031988 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29414880-vdbj2" podStartSLOduration=132.031973648 podStartE2EDuration="2m12.031973648s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:33.025647034 +0000 UTC m=+152.241308004" watchObservedRunningTime="2025-12-05 00:25:33.031973648 +0000 UTC m=+152.247634598" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.032621 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-9c5t7" podStartSLOduration=131.032614224 podStartE2EDuration="2m11.032614224s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:32.989906033 +0000 UTC m=+152.205566983" watchObservedRunningTime="2025-12-05 00:25:33.032614224 +0000 UTC m=+152.248275174" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.038499 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.039044 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:33.539029139 +0000 UTC m=+152.754690089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.129147 4759 patch_prober.go:28] interesting pod/console-operator-58897d9998-vtldn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.129196 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vtldn" podUID="f27f029e-62d3-497f-8f69-a95229ebe945" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.129273 4759 patch_prober.go:28] interesting pod/console-operator-58897d9998-vtldn container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.129322 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-vtldn" podUID="f27f029e-62d3-497f-8f69-a95229ebe945" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.140010 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.140434 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:33.64041958 +0000 UTC m=+152.856080530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.243861 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.244215 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:33.744202477 +0000 UTC m=+152.959863427 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.311945 4759 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6tr79 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" start-of-body= Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.311965 4759 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6tr79 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" start-of-body= Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.311997 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" podUID="03b74b39-100a-4fe4-8bae-0f2088728e24" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.311997 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" podUID="03b74b39-100a-4fe4-8bae-0f2088728e24" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.317055 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2zx9b" podStartSLOduration=131.317038992 podStartE2EDuration="2m11.317038992s" podCreationTimestamp="2025-12-05 
00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:33.25871008 +0000 UTC m=+152.474371040" watchObservedRunningTime="2025-12-05 00:25:33.317038992 +0000 UTC m=+152.532699942" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.351282 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754" event={"ID":"11f20e7f-9d7e-4574-928a-42697c2fdb81","Type":"ContainerStarted","Data":"487429f0cd17153d73c14cf2f5ba1854fd4bb3b6e8190d5a740f28f9ce9c56e0"} Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.351363 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-drwnh" podStartSLOduration=131.351340516 podStartE2EDuration="2m11.351340516s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:33.351283895 +0000 UTC m=+152.566944845" watchObservedRunningTime="2025-12-05 00:25:33.351340516 +0000 UTC m=+152.567001466" Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.351940 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.352043 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:33.852012813 +0000 UTC m=+153.067673763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.352138 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.357478 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:33.857436426 +0000 UTC m=+153.073097376 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.360459 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" event={"ID":"fc1ec533-684e-4f58-8861-adc357bf448e","Type":"ContainerStarted","Data":"cc8b66267539ed64fd5357dd35f5680f4202b422c4387a2198c07f812f5eab22"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.364011 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" event={"ID":"0d64b1f1-f632-4956-a9d8-83703ef96ca1","Type":"ContainerStarted","Data":"5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.364981 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.375456 4759 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-p6l5s container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body=
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.375517 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" podUID="0d64b1f1-f632-4956-a9d8-83703ef96ca1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.383968 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" event={"ID":"2183c13d-b67c-403a-8723-8c62e3ad57f3","Type":"ContainerStarted","Data":"1235bb2a1256da9c88899576dc720ec8692b1758876472f4efd225eee43a5296"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.399098 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dpv8g" podStartSLOduration=131.399080499 podStartE2EDuration="2m11.399080499s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:33.39745721 +0000 UTC m=+152.613118160" watchObservedRunningTime="2025-12-05 00:25:33.399080499 +0000 UTC m=+152.614741449"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.403552 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1d80fb4bc5e12bcea7130b33031293b92aae1731b27d9e9451e1248bf8612fb0"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.407147 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ab30578e1645c6ae4a3aa18e9a4318aa9cd929dbf957b0b7f349d55100b7078b"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.446378 4759 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-p6l5s container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body=
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.446431 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" podUID="0d64b1f1-f632-4956-a9d8-83703ef96ca1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.454968 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.455599 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:33.955579706 +0000 UTC m=+153.171240656 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.467433 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8pqf"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.470619 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh" event={"ID":"08f680e4-29bc-4ffc-962b-1a3151e5e41f","Type":"ContainerStarted","Data":"eeffaaed9794c9500f07e6bcf7765db6e4af8bab2fc7a9bf171164900b4915c6"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.482818 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mhbwk" event={"ID":"650c0e29-8158-4c6a-9b4f-2d6705ca4e87","Type":"ContainerStarted","Data":"0b598730030aea0c6d9d596eba2ddb95b2e2601b7a395219528f6a86827d6f47"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.505231 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" event={"ID":"0ae425cf-9dc3-471e-a2b8-506eedb29c8d","Type":"ContainerStarted","Data":"d3d08e9107983403d042f6b2fd6c4b1a5eccfa75dc681dd37045013df5abce17"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.506123 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.507712 4759 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7hqjc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.507775 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" podUID="0ae425cf-9dc3-471e-a2b8-506eedb29c8d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.508813 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" event={"ID":"ff68e70c-0561-4324-8cd4-1c8897cff45b","Type":"ContainerStarted","Data":"26b71467a3e63c82de8b9c697b7e8b07b1d95aacbc3c735c21c610e1491f71fd"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.509689 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" event={"ID":"a6019120-bf7b-47df-9e54-c7761066eb48","Type":"ContainerStarted","Data":"5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.510486 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.511371 4759 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8vtnn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.511427 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" podUID="a6019120-bf7b-47df-9e54-c7761066eb48" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.522832 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" podStartSLOduration=131.522812594 podStartE2EDuration="2m11.522812594s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:33.44672621 +0000 UTC m=+152.662387160" watchObservedRunningTime="2025-12-05 00:25:33.522812594 +0000 UTC m=+152.738473544"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.569013 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.569551 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" event={"ID":"9ba656f8-77bb-4402-8242-6fe3b116a8cc","Type":"ContainerStarted","Data":"7de19b6e6b0c37f09a38cb5eb6356e72eb94a1f7f8996bad2546224cda89f8bd"}
Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.573257 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:34.073231242 +0000 UTC m=+153.288892192 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.593719 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" event={"ID":"84b8f271-fcc3-4014-8a36-3e7019bef7c5","Type":"ContainerStarted","Data":"543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.599030 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" podStartSLOduration=132.599011129 podStartE2EDuration="2m12.599011129s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:33.598814515 +0000 UTC m=+152.814475495" watchObservedRunningTime="2025-12-05 00:25:33.599011129 +0000 UTC m=+152.814672079"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.599174 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-mhbwk" podStartSLOduration=131.599166763 podStartE2EDuration="2m11.599166763s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:33.554893735 +0000 UTC m=+152.770554685" watchObservedRunningTime="2025-12-05 00:25:33.599166763 +0000 UTC m=+152.814827703"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.599248 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.630080 4759 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-lr9vd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.5:6443/healthz\": dial tcp 10.217.0.5:6443: connect: connection refused" start-of-body=
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.630413 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" podUID="84b8f271-fcc3-4014-8a36-3e7019bef7c5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.5:6443/healthz\": dial tcp 10.217.0.5:6443: connect: connection refused"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.631619 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-mhbwk"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.631662 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-mhbwk"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.631907 4759 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8vtnn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.631960 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" podUID="a6019120-bf7b-47df-9e54-c7761066eb48" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.631981 4759 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8vtnn container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.632019 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" podUID="a6019120-bf7b-47df-9e54-c7761066eb48" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.654355 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.654413 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.655490 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" podStartSLOduration=131.655474414 podStartE2EDuration="2m11.655474414s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:33.654137592 +0000 UTC m=+152.869798542" watchObservedRunningTime="2025-12-05 00:25:33.655474414 +0000 UTC m=+152.871135364"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.671692 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.672699 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:34.172668674 +0000 UTC m=+153.388329634 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.672745 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x" event={"ID":"f2b40743-a414-4dd8-9613-0bc14b937e3d","Type":"ContainerStarted","Data":"29064afbb35ccf9cbaed98c8d3ba94b85efa2cb12f48d7033224a2a7071f660b"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.687707 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.692783 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:34.192764733 +0000 UTC m=+153.408425773 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.693278 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" event={"ID":"2031118b-50a5-4d09-afab-37e601a1631e","Type":"ContainerStarted","Data":"bfc53a98ec5a2325affb745838480f134b0f46e487e599704c09443ef4f4492a"}
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.699797 4759 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6tr79 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" start-of-body=
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.699864 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79" podUID="03b74b39-100a-4fe4-8bae-0f2088728e24" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.699938 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.699985 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.777849 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" podStartSLOduration=131.777827115 podStartE2EDuration="2m11.777827115s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:33.777708092 +0000 UTC m=+152.993369042" watchObservedRunningTime="2025-12-05 00:25:33.777827115 +0000 UTC m=+152.993488085"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.780174 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" podStartSLOduration=132.780159882 podStartE2EDuration="2m12.780159882s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:33.725479509 +0000 UTC m=+152.941140469" watchObservedRunningTime="2025-12-05 00:25:33.780159882 +0000 UTC m=+152.995820832"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.789364 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.790699 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:34.290678398 +0000 UTC m=+153.506339348 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.865939 4759 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7hqjc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.866002 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" podUID="0ae425cf-9dc3-471e-a2b8-506eedb29c8d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.893093 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.893521 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:34.393506632 +0000 UTC m=+153.609167582 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:33 crc kubenswrapper[4759]: I1205 00:25:33.994589 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:33 crc kubenswrapper[4759]: E1205 00:25:33.995425 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:34.495405404 +0000 UTC m=+153.711066354 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.018899 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ftl6j" podStartSLOduration=132.018875866 podStartE2EDuration="2m12.018875866s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:33.819453669 +0000 UTC m=+153.035114629" watchObservedRunningTime="2025-12-05 00:25:34.018875866 +0000 UTC m=+153.234536816"
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.101280 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.101690 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:34.601673133 +0000 UTC m=+153.817334083 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.103464 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-gng7x" podStartSLOduration=132.103444126 podStartE2EDuration="2m12.103444126s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:34.011603679 +0000 UTC m=+153.227264629" watchObservedRunningTime="2025-12-05 00:25:34.103444126 +0000 UTC m=+153.319105076"
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.105870 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pb4c2"]
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.110214 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pb4c2"
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.117320 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.120651 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pb4c2"]
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.125112 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lgc9q"]
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.126398 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lgc9q"
Dec 05 00:25:34 crc kubenswrapper[4759]: W1205 00:25:34.149632 4759 reflector.go:561] object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g": failed to list *v1.Secret: secrets "certified-operators-dockercfg-4rs5g" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object
Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.149696 4759 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-4rs5g\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"certified-operators-dockercfg-4rs5g\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.177388 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lgc9q"]
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.204465 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.204993 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-catalog-content\") pod \"certified-operators-lgc9q\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " pod="openshift-marketplace/certified-operators-lgc9q"
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.205023 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-catalog-content\") pod \"community-operators-pb4c2\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") " pod="openshift-marketplace/community-operators-pb4c2"
Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.205057 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn7m4\" (UniqueName: \"kubernetes.io/projected/031753f7-0b97-45ec-8e24-a6aeafb09d65-kube-api-access-vn7m4\") pod \"community-operators-pb4c2\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") " pod="openshift-marketplace/community-operators-pb4c2"
Dec
05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.205110 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-utilities\") pod \"community-operators-pb4c2\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") " pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.205141 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-utilities\") pod \"certified-operators-lgc9q\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.205183 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vfqj\" (UniqueName: \"kubernetes.io/projected/39167ad7-8b39-4c2b-b783-88427c69b7eb-kube-api-access-4vfqj\") pod \"certified-operators-lgc9q\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.205292 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:34.705273266 +0000 UTC m=+153.920934216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.307876 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-catalog-content\") pod \"certified-operators-lgc9q\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.307946 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-catalog-content\") pod \"community-operators-pb4c2\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") " pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.307999 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn7m4\" (UniqueName: \"kubernetes.io/projected/031753f7-0b97-45ec-8e24-a6aeafb09d65-kube-api-access-vn7m4\") pod \"community-operators-pb4c2\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") " pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.308033 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.308069 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-utilities\") pod \"community-operators-pb4c2\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") " pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.308097 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-utilities\") pod \"certified-operators-lgc9q\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.308145 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vfqj\" (UniqueName: \"kubernetes.io/projected/39167ad7-8b39-4c2b-b783-88427c69b7eb-kube-api-access-4vfqj\") pod \"certified-operators-lgc9q\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.308592 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-catalog-content\") pod \"certified-operators-lgc9q\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.308881 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:34.808862179 +0000 UTC m=+154.024523129 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.309052 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-catalog-content\") pod \"community-operators-pb4c2\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") " pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.309358 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-utilities\") pod \"certified-operators-lgc9q\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.309610 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-utilities\") pod \"community-operators-pb4c2\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") " pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.337905 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7fvzn"] Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.351999 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.364383 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7fvzn"] Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.407171 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vfqj\" (UniqueName: \"kubernetes.io/projected/39167ad7-8b39-4c2b-b783-88427c69b7eb-kube-api-access-4vfqj\") pod \"certified-operators-lgc9q\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.407186 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn7m4\" (UniqueName: \"kubernetes.io/projected/031753f7-0b97-45ec-8e24-a6aeafb09d65-kube-api-access-vn7m4\") pod \"community-operators-pb4c2\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") " pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.408650 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.409027 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:34.909010519 +0000 UTC m=+154.124671469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.437156 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.437235 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.437846 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vtldn" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.451417 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.510547 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8t4s6"] Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.511176 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.511219 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2w4d\" (UniqueName: \"kubernetes.io/projected/52b6eecd-9d85-46ac-9163-b04da27c2a2c-kube-api-access-m2w4d\") pod \"certified-operators-7fvzn\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") " pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.511252 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-catalog-content\") pod \"certified-operators-7fvzn\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") " pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.511277 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-utilities\") pod \"certified-operators-7fvzn\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") " pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.511563 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.011549927 +0000 UTC m=+154.227210877 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.511865 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.542563 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8t4s6"] Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.620844 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.621203 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.121188737 +0000 UTC m=+154.336849687 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.621248 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-utilities\") pod \"community-operators-8t4s6\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.621279 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.621321 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2w4d\" (UniqueName: \"kubernetes.io/projected/52b6eecd-9d85-46ac-9163-b04da27c2a2c-kube-api-access-m2w4d\") pod \"certified-operators-7fvzn\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") " pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.621347 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg72c\" (UniqueName: \"kubernetes.io/projected/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-kube-api-access-qg72c\") pod \"community-operators-8t4s6\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.621372 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-catalog-content\") pod \"certified-operators-7fvzn\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") " 
pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.621390 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-catalog-content\") pod \"community-operators-8t4s6\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.621413 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-utilities\") pod \"certified-operators-7fvzn\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") " pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.622189 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-utilities\") pod \"certified-operators-7fvzn\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") " pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.622468 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.122460338 +0000 UTC m=+154.338121288 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.623017 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-catalog-content\") pod \"certified-operators-7fvzn\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") " pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.652459 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:34 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:34 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:34 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.652515 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.703999 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2w4d\" (UniqueName: \"kubernetes.io/projected/52b6eecd-9d85-46ac-9163-b04da27c2a2c-kube-api-access-m2w4d\") pod 
\"certified-operators-7fvzn\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") " pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.717052 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" event={"ID":"09216400-991e-47e9-8494-ff24c8968a33","Type":"ContainerStarted","Data":"12f2234a22f8527603443ec763a525a397664c3843398132f0af392597c2541c"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.722699 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.722912 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg72c\" (UniqueName: \"kubernetes.io/projected/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-kube-api-access-qg72c\") pod \"community-operators-8t4s6\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.722956 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-catalog-content\") pod \"community-operators-8t4s6\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.723055 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.223027407 +0000 UTC m=+154.438688367 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.723088 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-utilities\") pod \"community-operators-8t4s6\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.723122 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.723403 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.223395686 +0000 UTC m=+154.439056636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.723416 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-catalog-content\") pod \"community-operators-8t4s6\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.723756 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-utilities\") pod \"community-operators-8t4s6\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.724360 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f9knl" event={"ID":"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69","Type":"ContainerStarted","Data":"5007d4c30e09281eed215f09e2586debff449b6c989cb50c804524a9083d1cec"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.724384 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f9knl" event={"ID":"70145bc3-1bf0-48e3-ab5d-84b0e04dfd69","Type":"ContainerStarted","Data":"6466184d0584f476a79751ad7254e147d7b653bb381f48baf09d664bc9c508e6"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.725771 4759 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.727374 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" event={"ID":"adb9d332-b13b-456d-9d04-32124d387a36","Type":"ContainerStarted","Data":"d671a230b3906cdfad78085ca1a95635367c77a144886cbbb7467e34de878814"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.755292 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" event={"ID":"cb7e479b-65fb-45dc-bf4b-cda530317c77","Type":"ContainerStarted","Data":"a25187a5536675427400d22e045e499d6312056300d709689df1c3ced0e18d96"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.767326 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-f9knl" podStartSLOduration=14.767286466 podStartE2EDuration="14.767286466s" podCreationTimestamp="2025-12-05 00:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:34.764525349 +0000 UTC m=+153.980186299" watchObservedRunningTime="2025-12-05 00:25:34.767286466 +0000 UTC m=+153.982947416" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.767736 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg72c\" (UniqueName: \"kubernetes.io/projected/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-kube-api-access-qg72c\") pod \"community-operators-8t4s6\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.798794 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" event={"ID":"ca2cc56f-b0a3-4e49-a9dd-c8918810f423","Type":"ContainerStarted","Data":"5173c53917b0f13e0274b532347534afa1ba3c4e7bc031cd6ff772b0dcdc8e97"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.824180 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.824628 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.324611021 +0000 UTC m=+154.540271971 (durationBeforeRetry 500ms). 
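The "SyncLoop (PLEG)" entries here are the Pod Lifecycle Event Generator: it relists container state from the runtime (CRI-O on this node) and converts each change into a {ID, Type, Data} event, where ID is the pod UID and Data a container or sandbox ID, which then wakes the sync loop for that pod. A compact sketch of the event plumbing, with the channel size and names chosen for illustration:

package main

import "fmt"

// plegEvent mirrors the fields visible in the log lines above.
type plegEvent struct {
	ID   string // pod UID
	Type string // e.g. "ContainerStarted"
	Data string // container or sandbox ID
}

func main() {
	events := make(chan plegEvent, 4)
	events <- plegEvent{
		ID:   "70145bc3-1bf0-48e3-ab5d-84b0e04dfd69", // dns-default-f9knl
		Type: "ContainerStarted",
		Data: "5007d4c30e09281eed215f09e2586debff449b6c989cb50c804524a9083d1cec",
	}
	close(events)
	for e := range events {
		fmt.Printf("SyncLoop (PLEG): event for pod %s event=%+v\n", e.ID, e)
	}
}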
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.835322 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wwwmq" podStartSLOduration=132.835284352 podStartE2EDuration="2m12.835284352s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:34.814681139 +0000 UTC m=+154.030342089" watchObservedRunningTime="2025-12-05 00:25:34.835284352 +0000 UTC m=+154.050945302" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.859836 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" event={"ID":"2183c13d-b67c-403a-8723-8c62e3ad57f3","Type":"ContainerStarted","Data":"b1f0f0713bdae3dda7bd6eaea72265f2e26ff8984bb84565b1a591b82402a784"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.860721 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.883806 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw" event={"ID":"02d7ef91-e7f4-496d-bc04-6a9a4a5ea6d4","Type":"ContainerStarted","Data":"5c3fc98a588c18563fa4563587b631c4483538727651573681da64d0d781de4e"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.884533 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.891618 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.905671 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" event={"ID":"ff68e70c-0561-4324-8cd4-1c8897cff45b","Type":"ContainerStarted","Data":"f6d2ec7e0b2e18ff17ffdbce7008d92bcfa1a7703aedb8ccfa0ed6665c3b8842"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.906529 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" podStartSLOduration=132.906511777 podStartE2EDuration="2m12.906511777s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:34.905884662 +0000 UTC m=+154.121545642" watchObservedRunningTime="2025-12-05 00:25:34.906511777 +0000 UTC m=+154.122172727" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.927448 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:34 crc kubenswrapper[4759]: E1205 00:25:34.929463 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.429452025 +0000 UTC m=+154.645112975 (durationBeforeRetry 500ms). 
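The pod_startup_latency_tracker entries are plain subtraction: podStartSLOduration is the watch-observed running time minus podCreationTimestamp, and pods whose images needed no pull carry the zero time value for firstStartedPulling/lastFinishedPulling. For apiserver-7bbb656c7d-nljw2 above, 00:25:34.906511777 minus 00:23:22 is 132.906511777s, i.e. 2m12.9s. Checked in Go:

package main

import (
	"fmt"
	"time"
)

func main() {
	created, err := time.Parse(time.RFC3339, "2025-12-05T00:23:22Z")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(time.RFC3339Nano, "2025-12-05T00:25:34.906511777Z")
	if err != nil {
		panic(err)
	}
	d := observed.Sub(created)
	fmt.Println(d) // 2m12.906511777s, i.e. podStartSLOduration=132.906511777
}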
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.936062 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw" podStartSLOduration=133.936043096 podStartE2EDuration="2m13.936043096s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:34.932948481 +0000 UTC m=+154.148609441" watchObservedRunningTime="2025-12-05 00:25:34.936043096 +0000 UTC m=+154.151704056" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.938466 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" event={"ID":"bbbc5689-2333-4227-984e-57e82e237746","Type":"ContainerStarted","Data":"4a305a306b9be95e14dafa1d49191ed5582da48514924b895106648d62a0cb85"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.969584 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"00eb13ffaee2fc6ab69f97439e94d9357325f76eff9209ad86b95d831b217477"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.996706 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pc8pz" podStartSLOduration=132.996690523 podStartE2EDuration="2m12.996690523s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:34.994448739 +0000 UTC m=+154.210109709" watchObservedRunningTime="2025-12-05 00:25:34.996690523 +0000 UTC m=+154.212351473" Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.997240 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a814d176e48af0f89e53c815597720f8c242c728479fd78e7c14549400de5648"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.997269 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ecadb35a5a683511c941fece708015d9cc47752c08275bd3098febc9a3cd88cd"} Dec 05 00:25:34 crc kubenswrapper[4759]: I1205 00:25:34.997597 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.028883 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk" event={"ID":"40f30299-1808-43a6-83db-44e27fa0b18e","Type":"ContainerStarted","Data":"706c3a5f844e08a687447a766a9519951a6657dd754fc2188049ae6eae0d0006"} Dec 
05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.029387 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:35 crc kubenswrapper[4759]: E1205 00:25:35.029848 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.52982614 +0000 UTC m=+154.745487090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.055095 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb" podStartSLOduration=133.055074515 podStartE2EDuration="2m13.055074515s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:35.030171808 +0000 UTC m=+154.245832768" watchObservedRunningTime="2025-12-05 00:25:35.055074515 +0000 UTC m=+154.270735465" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.069676 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b5c939597be69355769a3f52a89aa768d3624add23f2aa254b2010bc6361acd6"} Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.076710 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ppbrk" podStartSLOduration=133.076688882 podStartE2EDuration="2m13.076688882s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:35.075977054 +0000 UTC m=+154.291638004" watchObservedRunningTime="2025-12-05 00:25:35.076688882 +0000 UTC m=+154.292349822" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.116936 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754" event={"ID":"11f20e7f-9d7e-4574-928a-42697c2fdb81","Type":"ContainerStarted","Data":"3312f3299d3620478caffb5f493b7bec0e83e55b0b77f3886190f7c2e36c0c36"} Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.136709 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:35 crc kubenswrapper[4759]: E1205 00:25:35.138883 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.638870736 +0000 UTC m=+154.854531686 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.204048 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mqdv9" podStartSLOduration=134.204025883 podStartE2EDuration="2m14.204025883s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:35.156174147 +0000 UTC m=+154.371835097" watchObservedRunningTime="2025-12-05 00:25:35.204025883 +0000 UTC m=+154.419686833" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.207153 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" event={"ID":"e8a2cc0c-f6ac-46c5-9d42-5859ec38b291","Type":"ContainerStarted","Data":"f4ede16cc03ca9940e743c195a3faff5c50eb8b9a6162e7af97efdc693933317"} Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.238407 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:35 crc kubenswrapper[4759]: E1205 00:25:35.238550 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.738525573 +0000 UTC m=+154.954186523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.239162 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:35 crc kubenswrapper[4759]: E1205 00:25:35.239546 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.739529768 +0000 UTC m=+154.955190718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.248869 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pb4c2"] Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.257472 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6d754" podStartSLOduration=133.257452645 podStartE2EDuration="2m13.257452645s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:35.23469077 +0000 UTC m=+154.450351720" watchObservedRunningTime="2025-12-05 00:25:35.257452645 +0000 UTC m=+154.473113595" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.261073 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh" event={"ID":"08f680e4-29bc-4ffc-962b-1a3151e5e41f","Type":"ContainerStarted","Data":"11131bae96e184d69e2ff0cd6e2df372531705439695de76c6abd49869bad5a9"} Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.262473 4759 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-lr9vd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.5:6443/healthz\": dial tcp 10.217.0.5:6443: connect: connection refused" start-of-body= Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.262537 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" podUID="84b8f271-fcc3-4014-8a36-3e7019bef7c5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.5:6443/healthz\": dial tcp 10.217.0.5:6443: 
connect: connection refused" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.271202 4759 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8vtnn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.271291 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" podUID="a6019120-bf7b-47df-9e54-c7761066eb48" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.280771 4759 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7hqjc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.281117 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" podUID="0ae425cf-9dc3-471e-a2b8-506eedb29c8d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.304422 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sn2lw" podStartSLOduration=133.304394108 podStartE2EDuration="2m13.304394108s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:35.289991837 +0000 UTC m=+154.505652787" watchObservedRunningTime="2025-12-05 00:25:35.304394108 +0000 UTC m=+154.520055058" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.345931 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:35 crc kubenswrapper[4759]: E1205 00:25:35.346335 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.846319319 +0000 UTC m=+155.061980269 (durationBeforeRetry 500ms). 
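The probe failures collected here come in exactly two shapes: either the TCP connect is refused because nothing is listening yet (oauth-openshift, marketplace-operator, controller-manager), or the endpoint answers with a non-2xx status and the failing sub-checks are echoed as start-of-body (the router's 500 with [-]backend-http and [-]has-synced). The prober is essentially an HTTP GET with a short timeout; a sketch, with the URL and the one-second timeout as illustrative values:

package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

// probe runs one HTTP check and reproduces the two failure shapes in the log.
func probe(url string) (ok bool, output string) {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return false, err.Error() // e.g. "... connect: connection refused"
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(io.LimitReader(resp.Body, 1024)) // start-of-body, truncated
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return false, fmt.Sprintf("HTTP probe failed with statuscode: %d; start-of-body=%s",
			resp.StatusCode, body)
	}
	return true, ""
}

func main() {
	if ok, out := probe("http://10.217.0.35:8080/healthz"); !ok {
		fmt.Println("Probe failed:", out)
	}
}

None of these is fatal by itself: a failed readiness probe only pulls the pod out of service endpoints until a probe passes again, a liveness failure below its failure threshold does not yet restart the container, and the router's startup probe keeps retrying until backend-http and has-synced go green.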
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.349463 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kcxjh" podStartSLOduration=134.349447165 podStartE2EDuration="2m14.349447165s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:35.348693377 +0000 UTC m=+154.564354317" watchObservedRunningTime="2025-12-05 00:25:35.349447165 +0000 UTC m=+154.565108115" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.449822 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:35 crc kubenswrapper[4759]: E1205 00:25:35.457571 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:35.957554939 +0000 UTC m=+155.173215889 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.480826 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.498046 4759 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-marketplace/certified-operators-lgc9q" secret="" err="failed to sync secret cache: timed out waiting for the condition" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.498167 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.554883 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:35 crc kubenswrapper[4759]: E1205 00:25:35.555202 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:36.055186656 +0000 UTC m=+155.270847606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.611540 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.616028 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.634638 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:35 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:35 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:35 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.634696 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.658041 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:35 crc kubenswrapper[4759]: E1205 00:25:35.658442 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:36.158428471 +0000 UTC m=+155.374089421 (durationBeforeRetry 500ms). 
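Two entries in this stretch close out the earlier secret errors: at 00:25:34.149 listing certified-operators-dockercfg-4rs5g was forbidden ("no relationship found between node 'crc' and this object"), and at 00:25:35.611 the secret cache populated. That is the node authorizer working as designed: a kubelet may read a secret only once some pod bound to its node references it, so a watch that races ahead of the pod-to-node binding is denied and then succeeds on retry; the intermediate "failed to sync secret cache: timed out waiting for the condition" is the same race seen from the pod-sync side. A toy version of the relationship check; the real authorizer walks a graph of API objects rather than a slice:

package main

import "fmt"

type pod struct {
	node    string
	secrets []string // secret names the pod references
}

// nodeMayReadSecret approximates the node authorizer rule: allow only if a
// pod scheduled to this node references the secret.
func nodeMayReadSecret(pods []pod, node, secret string) bool {
	for _, p := range pods {
		if p.node != node {
			continue
		}
		for _, s := range p.secrets {
			if s == secret {
				return true
			}
		}
	}
	return false
}

func main() {
	secret := "certified-operators-dockercfg-4rs5g"
	before := []pod{}                                        // pod not yet bound to node "crc"
	after := []pod{{node: "crc", secrets: []string{secret}}} // after binding
	fmt.Println(nodeMayReadSecret(before, "crc", secret))    // false: forbidden
	fmt.Println(nodeMayReadSecret(after, "crc", secret))     // true: cache populates
}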
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.728807 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8t4s6"] Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.762505 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:35 crc kubenswrapper[4759]: E1205 00:25:35.763010 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:36.262994719 +0000 UTC m=+155.478655669 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.798140 4759 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.864136 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:35 crc kubenswrapper[4759]: E1205 00:25:35.864501 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:36.36448956 +0000 UTC m=+155.580150510 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.913247 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5s4sw"] Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.914566 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.918067 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.935159 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5s4sw"] Dec 05 00:25:35 crc kubenswrapper[4759]: I1205 00:25:35.964811 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:35 crc kubenswrapper[4759]: E1205 00:25:35.965150 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:36.465133952 +0000 UTC m=+155.680794892 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.068935 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-catalog-content\") pod \"redhat-marketplace-5s4sw\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") " pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.068990 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ngf7\" (UniqueName: \"kubernetes.io/projected/7c605086-ba35-4534-8732-246afbfde953-kube-api-access-2ngf7\") pod \"redhat-marketplace-5s4sw\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") " pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.069020 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.069051 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-utilities\") pod \"redhat-marketplace-5s4sw\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") " pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:25:36 crc kubenswrapper[4759]: E1205 00:25:36.069367 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:36.569356541 +0000 UTC m=+155.785017491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.170591 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:36 crc kubenswrapper[4759]: E1205 00:25:36.170760 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:36.67073053 +0000 UTC m=+155.886391480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.170817 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ngf7\" (UniqueName: \"kubernetes.io/projected/7c605086-ba35-4534-8732-246afbfde953-kube-api-access-2ngf7\") pod \"redhat-marketplace-5s4sw\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") " pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.170872 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.170933 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-utilities\") pod \"redhat-marketplace-5s4sw\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") " pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.171090 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-catalog-content\") pod \"redhat-marketplace-5s4sw\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") " pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:25:36 crc kubenswrapper[4759]: E1205 00:25:36.171535 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:36.671517318 +0000 UTC m=+155.887178258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.172043 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-catalog-content\") pod \"redhat-marketplace-5s4sw\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") " pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.172453 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-utilities\") pod \"redhat-marketplace-5s4sw\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") " pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.209191 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ngf7\" (UniqueName: \"kubernetes.io/projected/7c605086-ba35-4534-8732-246afbfde953-kube-api-access-2ngf7\") pod \"redhat-marketplace-5s4sw\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") " pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.268194 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.273339 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:36 crc kubenswrapper[4759]: E1205 00:25:36.273730 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:36.773703037 +0000 UTC m=+155.989363987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.279348 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8t4s6" event={"ID":"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9","Type":"ContainerStarted","Data":"148b21c86169043f4dbee07e60e185e607f815ff288c36cf93445c1083ddf8ca"} Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.283139 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9pqvs"] Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.285081 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.291300 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" event={"ID":"adb9d332-b13b-456d-9d04-32124d387a36","Type":"ContainerStarted","Data":"d39738fc953360332da997d19e7e949e9250a83b7bd85bd8d121cc728f291092"} Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.307480 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7fvzn"] Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.309935 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9pqvs"] Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.321480 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" event={"ID":"09216400-991e-47e9-8494-ff24c8968a33","Type":"ContainerStarted","Data":"b239c3811680539811f9884a2646d3ce61a244b98bc665f0a99174210d109c21"} Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.336341 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb4c2" event={"ID":"031753f7-0b97-45ec-8e24-a6aeafb09d65","Type":"ContainerStarted","Data":"a81117c2d0a0ecd8d41819212fe00552f64ebda0cc3937d72236714ea4ce3ed4"} Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.338607 4759 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8vtnn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.338652 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" podUID="a6019120-bf7b-47df-9e54-c7761066eb48" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.343858 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.377032 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.377132 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-utilities\") pod \"redhat-marketplace-9pqvs\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") " pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.377275 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mkk8\" (UniqueName: \"kubernetes.io/projected/628054fc-dfa7-402e-8bd0-d56eed57b9fe-kube-api-access-6mkk8\") pod \"redhat-marketplace-9pqvs\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") " pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.377562 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-catalog-content\") pod \"redhat-marketplace-9pqvs\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") " pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:25:36 crc kubenswrapper[4759]: E1205 00:25:36.381014 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 00:25:36.880919359 +0000 UTC m=+156.096580309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h2xln" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.478186 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.478345 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-utilities\") pod \"redhat-marketplace-9pqvs\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") " pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.478395 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mkk8\" (UniqueName: \"kubernetes.io/projected/628054fc-dfa7-402e-8bd0-d56eed57b9fe-kube-api-access-6mkk8\") pod \"redhat-marketplace-9pqvs\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") " pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.478469 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-catalog-content\") pod \"redhat-marketplace-9pqvs\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") " pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.478923 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-catalog-content\") pod \"redhat-marketplace-9pqvs\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") " pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:25:36 crc kubenswrapper[4759]: E1205 00:25:36.479820 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 00:25:36.979794538 +0000 UTC m=+156.195455488 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.480612 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-utilities\") pod \"redhat-marketplace-9pqvs\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") " pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.503022 4759 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-05T00:25:35.798167435Z","Handler":null,"Name":""} Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.507338 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mkk8\" (UniqueName: \"kubernetes.io/projected/628054fc-dfa7-402e-8bd0-d56eed57b9fe-kube-api-access-6mkk8\") pod \"redhat-marketplace-9pqvs\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") " pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.553973 4759 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.554019 4759 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.579480 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.626241 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:36 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:36 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:36 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.626316 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.652423 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.694578 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lgc9q"] Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.705271 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5s4sw"] Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.765098 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" podStartSLOduration=135.765083446 podStartE2EDuration="2m15.765083446s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:36.763726393 +0000 UTC m=+155.979387343" watchObservedRunningTime="2025-12-05 00:25:36.765083446 +0000 UTC m=+155.980744396" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.939589 4759 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-lr9vd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.5:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.940024 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" podUID="84b8f271-fcc3-4014-8a36-3e7019bef7c5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.5:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 00:25:36 crc kubenswrapper[4759]: I1205 00:25:36.964525 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qlgnw" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.031983 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9pqvs"] Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.087219 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rtslc"] Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.088196 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.089609 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmz5j\" (UniqueName: \"kubernetes.io/projected/eac9e47c-1b1d-4b22-9040-3a198c5758fe-kube-api-access-jmz5j\") pod \"redhat-operators-rtslc\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") " pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.089656 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-utilities\") pod \"redhat-operators-rtslc\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") " pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.089680 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-catalog-content\") pod \"redhat-operators-rtslc\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") " pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.089935 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.098744 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rtslc"] Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.191932 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmz5j\" (UniqueName: \"kubernetes.io/projected/eac9e47c-1b1d-4b22-9040-3a198c5758fe-kube-api-access-jmz5j\") pod \"redhat-operators-rtslc\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") " pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.192629 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-utilities\") pod \"redhat-operators-rtslc\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") " pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.192801 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-catalog-content\") pod \"redhat-operators-rtslc\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") " pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.193405 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-catalog-content\") pod \"redhat-operators-rtslc\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") " pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.193580 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-utilities\") pod \"redhat-operators-rtslc\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") " 
pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.212931 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmz5j\" (UniqueName: \"kubernetes.io/projected/eac9e47c-1b1d-4b22-9040-3a198c5758fe-kube-api-access-jmz5j\") pod \"redhat-operators-rtslc\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") " pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.242359 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.243227 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.247123 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.247737 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.288203 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.308846 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.309023 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.339456 4759 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.340072 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.392815 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s4sw" event={"ID":"7c605086-ba35-4534-8732-246afbfde953","Type":"ContainerStarted","Data":"cbbd617213e1f19021717a58d190fa7c509935774ca593395e241cc0565a1359"} Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.394742 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9pqvs" event={"ID":"628054fc-dfa7-402e-8bd0-d56eed57b9fe","Type":"ContainerStarted","Data":"5e0c1dd225b0bfde680bb78b8181daae73a07235ab6a3eb3ab9506017a8ad278"} Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.395626 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fvzn" event={"ID":"52b6eecd-9d85-46ac-9163-b04da27c2a2c","Type":"ContainerStarted","Data":"59210c6354afa7ca9987901e0d333d8e7a66b76c7b53d02820e57bd0549ebc71"} Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.397429 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgc9q" event={"ID":"39167ad7-8b39-4c2b-b783-88427c69b7eb","Type":"ContainerStarted","Data":"3a3f8b9e7949311ca69089cca3085bbb5fc4d5be80b183af6e3a0ac6acd74fbe"} Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.409761 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.410675 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.410730 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.410831 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.424559 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h2xln\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.461901 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.490705 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cmjn5"] Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.491814 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.511448 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.511812 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-catalog-content\") pod \"redhat-operators-cmjn5\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") " pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.511951 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hgfw\" (UniqueName: \"kubernetes.io/projected/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-kube-api-access-9hgfw\") pod \"redhat-operators-cmjn5\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") " pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.512011 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-utilities\") pod \"redhat-operators-cmjn5\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") " pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.512358 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cmjn5"] Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.565168 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.580197 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.608865 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.613447 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-catalog-content\") pod \"redhat-operators-cmjn5\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") " pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.613501 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hgfw\" (UniqueName: \"kubernetes.io/projected/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-kube-api-access-9hgfw\") pod \"redhat-operators-cmjn5\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") " pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.613531 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-utilities\") pod \"redhat-operators-cmjn5\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") " pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.614273 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-utilities\") pod \"redhat-operators-cmjn5\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") " pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.614523 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-catalog-content\") pod \"redhat-operators-cmjn5\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") " pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.627146 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:37 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:37 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:37 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.627215 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.647521 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hgfw\" (UniqueName: \"kubernetes.io/projected/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-kube-api-access-9hgfw\") pod \"redhat-operators-cmjn5\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") " pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.804581 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.804665 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:37 crc kubenswrapper[4759]: I1205 00:25:37.817105 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.014600 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h2xln"] Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.217817 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.217865 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.313956 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.335997 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rtslc"] Dec 05 00:25:38 crc kubenswrapper[4759]: W1205 00:25:38.347169 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc8f6b7e7_9ce9_4390_b0c2_1d8c78672d22.slice/crio-0abea7471196555e99f62bada17f9216016ce823798e9e57ba7b41a79d178728 WatchSource:0}: Error finding container 0abea7471196555e99f62bada17f9216016ce823798e9e57ba7b41a79d178728: Status 404 returned error can't find the container with id 0abea7471196555e99f62bada17f9216016ce823798e9e57ba7b41a79d178728 Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.444335 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cmjn5"] Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.453688 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" event={"ID":"6171c662-5317-43a1-bc72-e0d9fbe54466","Type":"ContainerStarted","Data":"04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17"} Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.453740 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" event={"ID":"6171c662-5317-43a1-bc72-e0d9fbe54466","Type":"ContainerStarted","Data":"436fcc5104b5b363c1411d3768fcc900d4c0f299f1c54586d8656e5131f1207a"} Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.454642 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.480509 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" podStartSLOduration=136.480474029 podStartE2EDuration="2m16.480474029s" podCreationTimestamp="2025-12-05 00:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:38.477743102 +0000 UTC m=+157.693404052" watchObservedRunningTime="2025-12-05 00:25:38.480474029 +0000 UTC m=+157.696134979" Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.482195 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s4sw" 
event={"ID":"7c605086-ba35-4534-8732-246afbfde953","Type":"ContainerStarted","Data":"992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b"} Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.490682 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.500140 4759 generic.go:334] "Generic (PLEG): container finished" podID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" containerID="f3db18ca2a3788256cefcd1fc0e8cba66a9d7c2fd6334c4b094223868eb7e27b" exitCode=0 Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.500656 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8t4s6" event={"ID":"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9","Type":"ContainerDied","Data":"f3db18ca2a3788256cefcd1fc0e8cba66a9d7c2fd6334c4b094223868eb7e27b"} Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.524045 4759 generic.go:334] "Generic (PLEG): container finished" podID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerID="c2f4fb9941d3d511464b337f646e42bc6c07be0c6417e05c04fafcd612ad78ce" exitCode=0 Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.524175 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgc9q" event={"ID":"39167ad7-8b39-4c2b-b783-88427c69b7eb","Type":"ContainerDied","Data":"c2f4fb9941d3d511464b337f646e42bc6c07be0c6417e05c04fafcd612ad78ce"} Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.545274 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtslc" event={"ID":"eac9e47c-1b1d-4b22-9040-3a198c5758fe","Type":"ContainerStarted","Data":"d74885a36d2c936d7043a54f886923d6f5fa79570928e0f5bc1f69d9dfc60420"} Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.567354 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb4c2" event={"ID":"031753f7-0b97-45ec-8e24-a6aeafb09d65","Type":"ContainerDied","Data":"151901f066c5f5c11a3766e4895346c333f5f67787fc8b8db442fe91f60bf6c9"} Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.567767 4759 generic.go:334] "Generic (PLEG): container finished" podID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerID="151901f066c5f5c11a3766e4895346c333f5f67787fc8b8db442fe91f60bf6c9" exitCode=0 Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.589139 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9pqvs" event={"ID":"628054fc-dfa7-402e-8bd0-d56eed57b9fe","Type":"ContainerStarted","Data":"782b740eca4b92c2388ce63329b8db2c129b7c90d383955959e7af5372b7be43"} Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.606423 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22","Type":"ContainerStarted","Data":"0abea7471196555e99f62bada17f9216016ce823798e9e57ba7b41a79d178728"} Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.629002 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" event={"ID":"09216400-991e-47e9-8494-ff24c8968a33","Type":"ContainerStarted","Data":"b0e3369b2b7019592d37c3d7ede96130910fc4b4c029307bb7b3b8ef10d44db5"} Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.637625 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:38 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:38 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:38 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.637687 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.666388 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fvzn" event={"ID":"52b6eecd-9d85-46ac-9163-b04da27c2a2c","Type":"ContainerStarted","Data":"f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f"} Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.706752 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-6hsqf" podStartSLOduration=18.70673482 podStartE2EDuration="18.70673482s" podCreationTimestamp="2025-12-05 00:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:38.703063861 +0000 UTC m=+157.918724811" watchObservedRunningTime="2025-12-05 00:25:38.70673482 +0000 UTC m=+157.922395770" Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.752366 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.780234 4759 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-nljw2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 00:25:38 crc kubenswrapper[4759]: [+]log ok Dec 05 00:25:38 crc kubenswrapper[4759]: [+]etcd ok Dec 05 00:25:38 crc kubenswrapper[4759]: [+]etcd-readiness ok Dec 05 00:25:38 crc kubenswrapper[4759]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 00:25:38 crc kubenswrapper[4759]: [-]informer-sync failed: reason withheld Dec 05 00:25:38 crc kubenswrapper[4759]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 00:25:38 crc kubenswrapper[4759]: [+]poststarthook/max-in-flight-filter ok Dec 05 00:25:38 crc kubenswrapper[4759]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 00:25:38 crc kubenswrapper[4759]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 00:25:38 crc kubenswrapper[4759]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 00:25:38 crc kubenswrapper[4759]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 00:25:38 crc kubenswrapper[4759]: [+]shutdown ok Dec 05 00:25:38 crc kubenswrapper[4759]: readyz check failed Dec 05 00:25:38 crc kubenswrapper[4759]: I1205 00:25:38.780311 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2" podUID="ca2cc56f-b0a3-4e49-a9dd-c8918810f423" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.172943 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" 
path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.375427 4759 patch_prober.go:28] interesting pod/apiserver-76f77b778f-ntfxg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 00:25:39 crc kubenswrapper[4759]: [+]log ok Dec 05 00:25:39 crc kubenswrapper[4759]: [+]etcd ok Dec 05 00:25:39 crc kubenswrapper[4759]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 00:25:39 crc kubenswrapper[4759]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 00:25:39 crc kubenswrapper[4759]: [+]poststarthook/max-in-flight-filter ok Dec 05 00:25:39 crc kubenswrapper[4759]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 00:25:39 crc kubenswrapper[4759]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 05 00:25:39 crc kubenswrapper[4759]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 05 00:25:39 crc kubenswrapper[4759]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 05 00:25:39 crc kubenswrapper[4759]: [+]poststarthook/project.openshift.io-projectcache ok Dec 05 00:25:39 crc kubenswrapper[4759]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 05 00:25:39 crc kubenswrapper[4759]: [+]poststarthook/openshift.io-startinformers ok Dec 05 00:25:39 crc kubenswrapper[4759]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 05 00:25:39 crc kubenswrapper[4759]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 00:25:39 crc kubenswrapper[4759]: livez check failed Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.375826 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" podUID="adb9d332-b13b-456d-9d04-32124d387a36" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.625846 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:39 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:39 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:39 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.626108 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.720601 4759 generic.go:334] "Generic (PLEG): container finished" podID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerID="b6f9967db2dbf318591c01fe1326d9b74737c8aec18f5a52e59db4527e51eb8a" exitCode=0 Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.720704 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtslc" event={"ID":"eac9e47c-1b1d-4b22-9040-3a198c5758fe","Type":"ContainerDied","Data":"b6f9967db2dbf318591c01fe1326d9b74737c8aec18f5a52e59db4527e51eb8a"} Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.724718 4759 
generic.go:334] "Generic (PLEG): container finished" podID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerID="b6a3ba4d0cbd3d065c943a377d75528ca33c9d257cfb2da4687bcbdb497154ff" exitCode=0 Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.724813 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjn5" event={"ID":"1dc5ee97-3aec-41e6-be6e-c479b3038dd6","Type":"ContainerDied","Data":"b6a3ba4d0cbd3d065c943a377d75528ca33c9d257cfb2da4687bcbdb497154ff"} Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.724855 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjn5" event={"ID":"1dc5ee97-3aec-41e6-be6e-c479b3038dd6","Type":"ContainerStarted","Data":"4cd819871cdb413a1b25d865c2c97a14ff03280f14f35cc94019feb2cf48aa6f"} Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.735164 4759 generic.go:334] "Generic (PLEG): container finished" podID="7c605086-ba35-4534-8732-246afbfde953" containerID="992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b" exitCode=0 Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.735247 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s4sw" event={"ID":"7c605086-ba35-4534-8732-246afbfde953","Type":"ContainerDied","Data":"992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b"} Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.748616 4759 generic.go:334] "Generic (PLEG): container finished" podID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" containerID="782b740eca4b92c2388ce63329b8db2c129b7c90d383955959e7af5372b7be43" exitCode=0 Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.748672 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9pqvs" event={"ID":"628054fc-dfa7-402e-8bd0-d56eed57b9fe","Type":"ContainerDied","Data":"782b740eca4b92c2388ce63329b8db2c129b7c90d383955959e7af5372b7be43"} Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.752434 4759 generic.go:334] "Generic (PLEG): container finished" podID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerID="f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f" exitCode=0 Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.752498 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fvzn" event={"ID":"52b6eecd-9d85-46ac-9163-b04da27c2a2c","Type":"ContainerDied","Data":"f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f"} Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.821499 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22","Type":"ContainerStarted","Data":"8abde2a5f176505405dfa5d1af252abff9f76506009398da01d26a9d71a1d53a"} Dec 05 00:25:39 crc kubenswrapper[4759]: I1205 00:25:39.925157 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.925140676 podStartE2EDuration="2.925140676s" podCreationTimestamp="2025-12-05 00:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:39.923827674 +0000 UTC m=+159.139488644" watchObservedRunningTime="2025-12-05 00:25:39.925140676 +0000 UTC m=+159.140801626" Dec 05 00:25:40 crc kubenswrapper[4759]: I1205 00:25:40.630897 4759 
Dec 05 00:25:40 crc kubenswrapper[4759]: I1205 00:25:40.630897 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 00:25:40 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld
Dec 05 00:25:40 crc kubenswrapper[4759]: [+]process-running ok
Dec 05 00:25:40 crc kubenswrapper[4759]: healthz check failed
Dec 05 00:25:40 crc kubenswrapper[4759]: I1205 00:25:40.630963 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 00:25:40 crc kubenswrapper[4759]: I1205 00:25:40.878490 4759 generic.go:334] "Generic (PLEG): container finished" podID="c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22" containerID="8abde2a5f176505405dfa5d1af252abff9f76506009398da01d26a9d71a1d53a" exitCode=0
Dec 05 00:25:40 crc kubenswrapper[4759]: I1205 00:25:40.879452 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22","Type":"ContainerDied","Data":"8abde2a5f176505405dfa5d1af252abff9f76506009398da01d26a9d71a1d53a"}
Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.443422 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.445015 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.447494 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.452392 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.458421 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.554698 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.554756 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.624112 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 00:25:41 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld
Dec 05 00:25:41 crc kubenswrapper[4759]: [+]process-running ok
Dec 05 00:25:41 crc
kubenswrapper[4759]: healthz check failed Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.624176 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.656390 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.656445 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.656516 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.691284 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.792219 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.891239 4759 generic.go:334] "Generic (PLEG): container finished" podID="9ba656f8-77bb-4402-8242-6fe3b116a8cc" containerID="7de19b6e6b0c37f09a38cb5eb6356e72eb94a1f7f8996bad2546224cda89f8bd" exitCode=0 Dec 05 00:25:41 crc kubenswrapper[4759]: I1205 00:25:41.891539 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" event={"ID":"9ba656f8-77bb-4402-8242-6fe3b116a8cc","Type":"ContainerDied","Data":"7de19b6e6b0c37f09a38cb5eb6356e72eb94a1f7f8996bad2546224cda89f8bd"} Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.190667 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.472413 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.576562 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kube-api-access\") pod \"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22\" (UID: \"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22\") " Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.577064 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kubelet-dir\") pod \"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22\" (UID: \"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22\") " Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.577463 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22" (UID: "c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.594967 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22" (UID: "c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.626371 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:42 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:42 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:42 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.626451 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.678099 4759 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.678137 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.809265 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.815490 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ntfxg" Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.902957 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz 
Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.902957 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.903011 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.903383 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.903404 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.996685 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc","Type":"ContainerStarted","Data":"8f49a14828950d5ae134fb3110ab8271c0e70e4a6371f758ec0ef80768211c18"}
Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.999110 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.999533 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22","Type":"ContainerDied","Data":"0abea7471196555e99f62bada17f9216016ce823798e9e57ba7b41a79d178728"}
Dec 05 00:25:42 crc kubenswrapper[4759]: I1205 00:25:42.999594 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0abea7471196555e99f62bada17f9216016ce823798e9e57ba7b41a79d178728"
Dec 05 00:25:43 crc kubenswrapper[4759]: I1205 00:25:43.017834 4759 patch_prober.go:28] interesting pod/console-f9d7485db-g8mxq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Dec 05 00:25:43 crc kubenswrapper[4759]: I1205 00:25:43.017880 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g8mxq" podUID="576c976f-56ce-4409-8654-e9a6264a71d1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused"
Dec 05 00:25:43 crc kubenswrapper[4759]: I1205 00:25:43.223689 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nljw2"
Dec 05 00:25:43 crc kubenswrapper[4759]: I1205 00:25:43.317204 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6tr79"
Dec 05 00:25:43 crc kubenswrapper[4759]: I1205 00:25:43.626202 4759 patch_prober.go:28] interesting
pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:43 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:43 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:43 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:43 crc kubenswrapper[4759]: I1205 00:25:43.626281 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:43 crc kubenswrapper[4759]: I1205 00:25:43.657039 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" Dec 05 00:25:43 crc kubenswrapper[4759]: I1205 00:25:43.918624 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.010519 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.010508 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw" event={"ID":"9ba656f8-77bb-4402-8242-6fe3b116a8cc","Type":"ContainerDied","Data":"ff6788171f883e26326545b2543f8e46cf6ce186f4e6d0dd47c76639e92beb8e"} Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.010663 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff6788171f883e26326545b2543f8e46cf6ce186f4e6d0dd47c76639e92beb8e" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.012814 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc","Type":"ContainerStarted","Data":"8754b4dc9a28b5af432fb148ea52fe43343d6ed3de28a2c85da36efdeea9f59b"} Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.026360 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.02634606 podStartE2EDuration="3.02634606s" podCreationTimestamp="2025-12-05 00:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:44.02382221 +0000 UTC m=+163.239483160" watchObservedRunningTime="2025-12-05 00:25:44.02634606 +0000 UTC m=+163.242007010" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.116564 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ba656f8-77bb-4402-8242-6fe3b116a8cc-secret-volume\") pod \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.116665 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ba656f8-77bb-4402-8242-6fe3b116a8cc-config-volume\") pod \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " Dec 05 00:25:44 crc 
kubenswrapper[4759]: I1205 00:25:44.116701 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwmc6\" (UniqueName: \"kubernetes.io/projected/9ba656f8-77bb-4402-8242-6fe3b116a8cc-kube-api-access-kwmc6\") pod \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\" (UID: \"9ba656f8-77bb-4402-8242-6fe3b116a8cc\") " Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.117926 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ba656f8-77bb-4402-8242-6fe3b116a8cc-config-volume" (OuterVolumeSpecName: "config-volume") pod "9ba656f8-77bb-4402-8242-6fe3b116a8cc" (UID: "9ba656f8-77bb-4402-8242-6fe3b116a8cc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.137441 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba656f8-77bb-4402-8242-6fe3b116a8cc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9ba656f8-77bb-4402-8242-6fe3b116a8cc" (UID: "9ba656f8-77bb-4402-8242-6fe3b116a8cc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.138848 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba656f8-77bb-4402-8242-6fe3b116a8cc-kube-api-access-kwmc6" (OuterVolumeSpecName: "kube-api-access-kwmc6") pod "9ba656f8-77bb-4402-8242-6fe3b116a8cc" (UID: "9ba656f8-77bb-4402-8242-6fe3b116a8cc"). InnerVolumeSpecName "kube-api-access-kwmc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.218010 4759 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ba656f8-77bb-4402-8242-6fe3b116a8cc-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.218043 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ba656f8-77bb-4402-8242-6fe3b116a8cc-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.218054 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwmc6\" (UniqueName: \"kubernetes.io/projected/9ba656f8-77bb-4402-8242-6fe3b116a8cc-kube-api-access-kwmc6\") on node \"crc\" DevicePath \"\"" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.625107 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:44 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:44 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:44 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.625484 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.752757 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.761598 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6ca2f36-241c-41cb-9d1d-d6856e819953-metrics-certs\") pod \"network-metrics-daemon-ksxg9\" (UID: \"f6ca2f36-241c-41cb-9d1d-d6856e819953\") " pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.787609 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-f9knl" Dec 05 00:25:44 crc kubenswrapper[4759]: I1205 00:25:44.882803 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ksxg9" Dec 05 00:25:45 crc kubenswrapper[4759]: I1205 00:25:45.062800 4759 generic.go:334] "Generic (PLEG): container finished" podID="6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc" containerID="8754b4dc9a28b5af432fb148ea52fe43343d6ed3de28a2c85da36efdeea9f59b" exitCode=0 Dec 05 00:25:45 crc kubenswrapper[4759]: I1205 00:25:45.062843 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc","Type":"ContainerDied","Data":"8754b4dc9a28b5af432fb148ea52fe43343d6ed3de28a2c85da36efdeea9f59b"} Dec 05 00:25:45 crc kubenswrapper[4759]: I1205 00:25:45.416682 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ksxg9"] Dec 05 00:25:45 crc kubenswrapper[4759]: W1205 00:25:45.424493 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6ca2f36_241c_41cb_9d1d_d6856e819953.slice/crio-0339e0287ce9ad4c41d93eddbf6548a4379d210ff6a44d5ffa51c555d40a33c8 WatchSource:0}: Error finding container 0339e0287ce9ad4c41d93eddbf6548a4379d210ff6a44d5ffa51c555d40a33c8: Status 404 returned error can't find the container with id 0339e0287ce9ad4c41d93eddbf6548a4379d210ff6a44d5ffa51c555d40a33c8 Dec 05 00:25:45 crc kubenswrapper[4759]: I1205 00:25:45.625277 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:45 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:45 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:45 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:45 crc kubenswrapper[4759]: I1205 00:25:45.625371 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:45 crc kubenswrapper[4759]: I1205 00:25:45.942823 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:25:46 crc kubenswrapper[4759]: I1205 00:25:46.098155 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" 
event={"ID":"f6ca2f36-241c-41cb-9d1d-d6856e819953","Type":"ContainerStarted","Data":"0339e0287ce9ad4c41d93eddbf6548a4379d210ff6a44d5ffa51c555d40a33c8"} Dec 05 00:25:46 crc kubenswrapper[4759]: I1205 00:25:46.487042 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 00:25:46 crc kubenswrapper[4759]: I1205 00:25:46.525264 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kube-api-access\") pod \"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc\" (UID: \"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc\") " Dec 05 00:25:46 crc kubenswrapper[4759]: I1205 00:25:46.526538 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kubelet-dir\") pod \"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc\" (UID: \"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc\") " Dec 05 00:25:46 crc kubenswrapper[4759]: I1205 00:25:46.526941 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc" (UID: "6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:25:46 crc kubenswrapper[4759]: I1205 00:25:46.546298 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc" (UID: "6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:25:46 crc kubenswrapper[4759]: I1205 00:25:46.627778 4759 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 00:25:46 crc kubenswrapper[4759]: I1205 00:25:46.627839 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 00:25:46 crc kubenswrapper[4759]: I1205 00:25:46.628991 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:46 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:46 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:46 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:46 crc kubenswrapper[4759]: I1205 00:25:46.629056 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:47 crc kubenswrapper[4759]: I1205 00:25:47.130053 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc","Type":"ContainerDied","Data":"8f49a14828950d5ae134fb3110ab8271c0e70e4a6371f758ec0ef80768211c18"} Dec 05 00:25:47 crc kubenswrapper[4759]: I1205 00:25:47.130090 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 00:25:47 crc kubenswrapper[4759]: I1205 00:25:47.130103 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f49a14828950d5ae134fb3110ab8271c0e70e4a6371f758ec0ef80768211c18" Dec 05 00:25:47 crc kubenswrapper[4759]: I1205 00:25:47.134259 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" event={"ID":"f6ca2f36-241c-41cb-9d1d-d6856e819953","Type":"ContainerStarted","Data":"d32433c46eb6c2ea410bad1469d5732d468724d8e020cdf9899f2d68e9fc4f73"} Dec 05 00:25:47 crc kubenswrapper[4759]: I1205 00:25:47.134294 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ksxg9" event={"ID":"f6ca2f36-241c-41cb-9d1d-d6856e819953","Type":"ContainerStarted","Data":"504b26d2c6c2baf1c58069c7d2a775dcb76e95828c090d595829bbaf074852e7"} Dec 05 00:25:47 crc kubenswrapper[4759]: I1205 00:25:47.156227 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ksxg9" podStartSLOduration=146.156208146 podStartE2EDuration="2m26.156208146s" podCreationTimestamp="2025-12-05 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:25:47.155520639 +0000 UTC m=+166.371181589" watchObservedRunningTime="2025-12-05 00:25:47.156208146 +0000 UTC m=+166.371869106" Dec 05 00:25:47 crc kubenswrapper[4759]: I1205 00:25:47.628636 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:47 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:47 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:47 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:47 crc kubenswrapper[4759]: I1205 00:25:47.628698 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:48 crc kubenswrapper[4759]: I1205 00:25:48.625320 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:48 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:48 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:48 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:48 crc kubenswrapper[4759]: I1205 00:25:48.625378 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:49 crc kubenswrapper[4759]: I1205 00:25:49.628177 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:49 crc kubenswrapper[4759]: [-]has-synced 
failed: reason withheld Dec 05 00:25:49 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:49 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:49 crc kubenswrapper[4759]: I1205 00:25:49.628288 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:50 crc kubenswrapper[4759]: I1205 00:25:50.624482 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:50 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:50 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:50 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:50 crc kubenswrapper[4759]: I1205 00:25:50.624890 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:51 crc kubenswrapper[4759]: I1205 00:25:51.626830 4759 patch_prober.go:28] interesting pod/router-default-5444994796-mhbwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 00:25:51 crc kubenswrapper[4759]: [-]has-synced failed: reason withheld Dec 05 00:25:51 crc kubenswrapper[4759]: [+]process-running ok Dec 05 00:25:51 crc kubenswrapper[4759]: healthz check failed Dec 05 00:25:51 crc kubenswrapper[4759]: I1205 00:25:51.626884 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mhbwk" podUID="650c0e29-8158-4c6a-9b4f-2d6705ca4e87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 00:25:52 crc kubenswrapper[4759]: I1205 00:25:52.625129 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:52 crc kubenswrapper[4759]: I1205 00:25:52.629513 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-mhbwk" Dec 05 00:25:52 crc kubenswrapper[4759]: I1205 00:25:52.903142 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 05 00:25:52 crc kubenswrapper[4759]: I1205 00:25:52.903191 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 05 00:25:52 crc kubenswrapper[4759]: I1205 00:25:52.903236 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-wf4wz" Dec 05 00:25:52 crc kubenswrapper[4759]: I1205 00:25:52.903254 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Dec 05 00:25:52 crc kubenswrapper[4759]: I1205 00:25:52.903381 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Dec 05 00:25:52 crc kubenswrapper[4759]: I1205 00:25:52.903893 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"01ed31f23320651f59fd3c2115392c162c0e0d3528addccf0d3137478b4b1459"} pod="openshift-console/downloads-7954f5f757-wf4wz" containerMessage="Container download-server failed liveness probe, will be restarted"
Dec 05 00:25:52 crc kubenswrapper[4759]: I1205 00:25:52.903971 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" containerID="cri-o://01ed31f23320651f59fd3c2115392c162c0e0d3528addccf0d3137478b4b1459" gracePeriod=2
Dec 05 00:25:52 crc kubenswrapper[4759]: I1205 00:25:52.904034 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Dec 05 00:25:52 crc kubenswrapper[4759]: I1205 00:25:52.904059 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Dec 05 00:25:53 crc kubenswrapper[4759]: I1205 00:25:53.017373 4759 patch_prober.go:28] interesting pod/console-f9d7485db-g8mxq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Dec 05 00:25:53 crc kubenswrapper[4759]: I1205 00:25:53.017435 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g8mxq" podUID="576c976f-56ce-4409-8654-e9a6264a71d1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused"
Dec 05 00:25:54 crc kubenswrapper[4759]: I1205 00:25:54.204380 4759 generic.go:334] "Generic (PLEG): container finished" podID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerID="01ed31f23320651f59fd3c2115392c162c0e0d3528addccf0d3137478b4b1459" exitCode=0
Dec 05 00:25:54 crc kubenswrapper[4759]: I1205 00:25:54.204487 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wf4wz" event={"ID":"f99bc61a-b820-4ebd-8ed0-d18cba6c017a","Type":"ContainerDied","Data":"01ed31f23320651f59fd3c2115392c162c0e0d3528addccf0d3137478b4b1459"}
Dec 05 00:25:57 crc kubenswrapper[4759]: I1205 00:25:57.614184 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln"
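Annotation: "connect: connection refused" means the prober could not open a TCP connection at all, unlike the 500s earlier where the server answered; after enough consecutive liveness failures the kubelet kills the container (here with gracePeriod=2) and restarts it. A simplified stand-in for the check itself, standard library only and not the kubelet's actual code:

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce mimics one HTTP probe attempt: transport errors (such as
// "connect: connection refused") and any status >= 400 count as failures.
func probeOnce(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp ...: connect: connection refused", as logged above
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// The endpoint from the records above; it only answers once the
	// download-server container is actually listening.
	if err := probeOnce("http://10.217.0.28:8080/"); err != nil {
		fmt.Println("Probe failed:", err)
	}
}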
Dec 05 00:26:02 crc kubenswrapper[4759]: I1205 00:26:02.904195 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Dec 05 00:26:02 crc kubenswrapper[4759]: I1205 00:26:02.904665 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Dec 05 00:26:03 crc kubenswrapper[4759]: I1205 00:26:03.021559 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-g8mxq"
Dec 05 00:26:03 crc kubenswrapper[4759]: I1205 00:26:03.025527 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-g8mxq"
Dec 05 00:26:04 crc kubenswrapper[4759]: I1205 00:26:04.433389 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 00:26:04 crc kubenswrapper[4759]: I1205 00:26:04.433721 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 00:26:10 crc kubenswrapper[4759]: I1205 00:26:10.405423 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 00:26:12 crc kubenswrapper[4759]: I1205 00:26:12.297203 4759 generic.go:334] "Generic (PLEG): container finished" podID="45df7d8a-597a-42b0-8116-37bf7d3e7627" containerID="856838aa559f79a83b467a824dfdf54073acf1614c264906d5b28fa17b422af0" exitCode=0
Dec 05 00:26:12 crc kubenswrapper[4759]: I1205 00:26:12.297246 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29414880-vdbj2" event={"ID":"45df7d8a-597a-42b0-8116-37bf7d3e7627","Type":"ContainerDied","Data":"856838aa559f79a83b467a824dfdf54073acf1614c264906d5b28fa17b422af0"}
Dec 05 00:26:12 crc kubenswrapper[4759]: I1205 00:26:12.903375 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Dec 05 00:26:12 crc kubenswrapper[4759]: I1205 00:26:12.903763 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Dec 05 00:26:13 crc kubenswrapper[4759]: I1205 00:26:13.621204 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-66mwb"
Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.421084 4759 kubelet.go:2421] "SyncLoop ADD"
source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 00:26:14 crc kubenswrapper[4759]: E1205 00:26:14.421353 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba656f8-77bb-4402-8242-6fe3b116a8cc" containerName="collect-profiles" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.421368 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba656f8-77bb-4402-8242-6fe3b116a8cc" containerName="collect-profiles" Dec 05 00:26:14 crc kubenswrapper[4759]: E1205 00:26:14.421382 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc" containerName="pruner" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.421389 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc" containerName="pruner" Dec 05 00:26:14 crc kubenswrapper[4759]: E1205 00:26:14.421408 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22" containerName="pruner" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.421414 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22" containerName="pruner" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.421525 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f6b7e7-9ce9-4390-b0c2-1d8c78672d22" containerName="pruner" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.421534 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba656f8-77bb-4402-8242-6fe3b116a8cc" containerName="collect-profiles" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.421550 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d69b4d2-5aa0-4183-b3fc-53b3c649e1bc" containerName="pruner" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.421958 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.429165 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71bd8cd3-7184-4fa6-931c-b6cc8967911b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.429229 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71bd8cd3-7184-4fa6-931c-b6cc8967911b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.430747 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.430985 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.438565 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.529989 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71bd8cd3-7184-4fa6-931c-b6cc8967911b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.530090 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71bd8cd3-7184-4fa6-931c-b6cc8967911b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.530095 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71bd8cd3-7184-4fa6-931c-b6cc8967911b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.547258 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71bd8cd3-7184-4fa6-931c-b6cc8967911b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 00:26:14 crc kubenswrapper[4759]: I1205 00:26:14.753681 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.023895 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.025121 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.031124 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.203716 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.203807 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kube-api-access\") pod \"installer-9-crc\" (UID: \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.204040 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-var-lock\") pod \"installer-9-crc\" (UID: \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.305407 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.305237 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.305571 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kube-api-access\") pod \"installer-9-crc\" (UID: \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.306129 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-var-lock\") pod \"installer-9-crc\" (UID: \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.306250 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-var-lock\") pod \"installer-9-crc\" (UID: \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.337169 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:26:19 crc kubenswrapper[4759]: I1205 00:26:19.344951 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:26:21 crc kubenswrapper[4759]: E1205 00:26:21.428380 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 00:26:21 crc kubenswrapper[4759]: E1205 00:26:21.429891 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qg72c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8t4s6_openshift-marketplace(7382abb9-f18c-4d5e-90c4-aecf34f4a2d9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 00:26:21 crc kubenswrapper[4759]: E1205 00:26:21.431402 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8t4s6" podUID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" Dec 05 00:26:22 crc kubenswrapper[4759]: I1205 00:26:22.902931 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 05 00:26:22 crc kubenswrapper[4759]: I1205 00:26:22.903191 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial 
tcp 10.217.0.28:8080: connect: connection refused" Dec 05 00:26:24 crc kubenswrapper[4759]: E1205 00:26:24.008866 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8t4s6" podUID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" Dec 05 00:26:24 crc kubenswrapper[4759]: I1205 00:26:24.055231 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29414880-vdbj2" Dec 05 00:26:24 crc kubenswrapper[4759]: I1205 00:26:24.163377 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5b2s\" (UniqueName: \"kubernetes.io/projected/45df7d8a-597a-42b0-8116-37bf7d3e7627-kube-api-access-g5b2s\") pod \"45df7d8a-597a-42b0-8116-37bf7d3e7627\" (UID: \"45df7d8a-597a-42b0-8116-37bf7d3e7627\") " Dec 05 00:26:24 crc kubenswrapper[4759]: I1205 00:26:24.163482 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/45df7d8a-597a-42b0-8116-37bf7d3e7627-serviceca\") pod \"45df7d8a-597a-42b0-8116-37bf7d3e7627\" (UID: \"45df7d8a-597a-42b0-8116-37bf7d3e7627\") " Dec 05 00:26:24 crc kubenswrapper[4759]: I1205 00:26:24.164202 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45df7d8a-597a-42b0-8116-37bf7d3e7627-serviceca" (OuterVolumeSpecName: "serviceca") pod "45df7d8a-597a-42b0-8116-37bf7d3e7627" (UID: "45df7d8a-597a-42b0-8116-37bf7d3e7627"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:26:24 crc kubenswrapper[4759]: I1205 00:26:24.174142 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45df7d8a-597a-42b0-8116-37bf7d3e7627-kube-api-access-g5b2s" (OuterVolumeSpecName: "kube-api-access-g5b2s") pod "45df7d8a-597a-42b0-8116-37bf7d3e7627" (UID: "45df7d8a-597a-42b0-8116-37bf7d3e7627"). InnerVolumeSpecName "kube-api-access-g5b2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:26:24 crc kubenswrapper[4759]: I1205 00:26:24.264769 4759 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/45df7d8a-597a-42b0-8116-37bf7d3e7627-serviceca\") on node \"crc\" DevicePath \"\"" Dec 05 00:26:24 crc kubenswrapper[4759]: I1205 00:26:24.264817 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5b2s\" (UniqueName: \"kubernetes.io/projected/45df7d8a-597a-42b0-8116-37bf7d3e7627-kube-api-access-g5b2s\") on node \"crc\" DevicePath \"\"" Dec 05 00:26:24 crc kubenswrapper[4759]: I1205 00:26:24.365491 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29414880-vdbj2" event={"ID":"45df7d8a-597a-42b0-8116-37bf7d3e7627","Type":"ContainerDied","Data":"39ae55bd0855b03ac8c164ea0c1e28dda9c11d3be203ec80acd9f1d587da39a6"} Dec 05 00:26:24 crc kubenswrapper[4759]: I1205 00:26:24.365534 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39ae55bd0855b03ac8c164ea0c1e28dda9c11d3be203ec80acd9f1d587da39a6" Dec 05 00:26:24 crc kubenswrapper[4759]: I1205 00:26:24.365547 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29414880-vdbj2" Dec 05 00:26:24 crc kubenswrapper[4759]: E1205 00:26:24.866969 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 00:26:24 crc kubenswrapper[4759]: E1205 00:26:24.867254 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn7m4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-pb4c2_openshift-marketplace(031753f7-0b97-45ec-8e24-a6aeafb09d65): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 00:26:24 crc kubenswrapper[4759]: E1205 00:26:24.869606 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pb4c2" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" Dec 05 00:26:30 crc kubenswrapper[4759]: E1205 00:26:30.226091 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pb4c2" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" Dec 05 00:26:30 crc kubenswrapper[4759]: E1205 00:26:30.416474 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 00:26:30 crc kubenswrapper[4759]: E1205 00:26:30.416653 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jmz5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rtslc_openshift-marketplace(eac9e47c-1b1d-4b22-9040-3a198c5758fe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 00:26:30 crc kubenswrapper[4759]: E1205 00:26:30.418226 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rtslc" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" Dec 05 00:26:31 crc kubenswrapper[4759]: E1205 00:26:31.453250 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rtslc" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" Dec 05 00:26:31 crc kubenswrapper[4759]: E1205 00:26:31.515492 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 00:26:31 crc kubenswrapper[4759]: E1205 00:26:31.515684 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2ngf7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5s4sw_openshift-marketplace(7c605086-ba35-4534-8732-246afbfde953): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 00:26:31 crc kubenswrapper[4759]: E1205 00:26:31.517498 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5s4sw" podUID="7c605086-ba35-4534-8732-246afbfde953" Dec 05 00:26:31 crc kubenswrapper[4759]: E1205 00:26:31.562996 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 00:26:31 crc kubenswrapper[4759]: E1205 00:26:31.563146 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9hgfw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cmjn5_openshift-marketplace(1dc5ee97-3aec-41e6-be6e-c479b3038dd6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 00:26:31 crc kubenswrapper[4759]: E1205 00:26:31.564297 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cmjn5" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" Dec 05 00:26:32 crc kubenswrapper[4759]: I1205 00:26:32.903056 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 05 00:26:32 crc kubenswrapper[4759]: I1205 00:26:32.903422 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.039501 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cmjn5" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.039496 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5s4sw" podUID="7c605086-ba35-4534-8732-246afbfde953" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.123403 4759 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.123773 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6mkk8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9pqvs_openshift-marketplace(628054fc-dfa7-402e-8bd0-d56eed57b9fe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.125242 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9pqvs" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.129888 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.130006 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2w4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7fvzn_openshift-marketplace(52b6eecd-9d85-46ac-9163-b04da27c2a2c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.131273 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7fvzn" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.157453 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.157596 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4vfqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lgc9q_openshift-marketplace(39167ad7-8b39-4c2b-b783-88427c69b7eb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.159256 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lgc9q" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" Dec 05 00:26:33 crc kubenswrapper[4759]: I1205 00:26:33.411862 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wf4wz" event={"ID":"f99bc61a-b820-4ebd-8ed0-d18cba6c017a","Type":"ContainerStarted","Data":"0ed6bbbfed1d5973959dd2a9d185d3470276d3cd38424e80124b1f23c699a220"} Dec 05 00:26:33 crc kubenswrapper[4759]: I1205 00:26:33.412777 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-wf4wz" Dec 05 00:26:33 crc kubenswrapper[4759]: I1205 00:26:33.413126 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 05 00:26:33 crc kubenswrapper[4759]: I1205 00:26:33.413163 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.413888 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-lgc9q" 
podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.414123 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7fvzn" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" Dec 05 00:26:33 crc kubenswrapper[4759]: E1205 00:26:33.415750 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9pqvs" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" Dec 05 00:26:33 crc kubenswrapper[4759]: I1205 00:26:33.613377 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 00:26:33 crc kubenswrapper[4759]: W1205 00:26:33.625529 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddbabb1ea_7574_4d58_9a61_0982f4d1897a.slice/crio-5eed89ef304b520fe180547d90744868a19fe060ee87795e616860322ef32b5b WatchSource:0}: Error finding container 5eed89ef304b520fe180547d90744868a19fe060ee87795e616860322ef32b5b: Status 404 returned error can't find the container with id 5eed89ef304b520fe180547d90744868a19fe060ee87795e616860322ef32b5b Dec 05 00:26:33 crc kubenswrapper[4759]: I1205 00:26:33.683230 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 00:26:33 crc kubenswrapper[4759]: W1205 00:26:33.704951 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod71bd8cd3_7184_4fa6_931c_b6cc8967911b.slice/crio-27abe169886e9558c45f59baaaf6fcd75d961346dafd99e289faab576205e999 WatchSource:0}: Error finding container 27abe169886e9558c45f59baaaf6fcd75d961346dafd99e289faab576205e999: Status 404 returned error can't find the container with id 27abe169886e9558c45f59baaaf6fcd75d961346dafd99e289faab576205e999 Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.419109 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dbabb1ea-7574-4d58-9a61-0982f4d1897a","Type":"ContainerStarted","Data":"5f73efcaa9aaceecce52a5f5e4db4e52894a0b02aa209cc3b6fac154e727e3c4"} Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.419612 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dbabb1ea-7574-4d58-9a61-0982f4d1897a","Type":"ContainerStarted","Data":"5eed89ef304b520fe180547d90744868a19fe060ee87795e616860322ef32b5b"} Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.420732 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71bd8cd3-7184-4fa6-931c-b6cc8967911b","Type":"ContainerStarted","Data":"33a655d6f1074f0248fcdebd7ed5c5675f4fd6fae503e6e7133e1550c0a62849"} Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.420776 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71bd8cd3-7184-4fa6-931c-b6cc8967911b","Type":"ContainerStarted","Data":"27abe169886e9558c45f59baaaf6fcd75d961346dafd99e289faab576205e999"} Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.421223 4759 patch_prober.go:28] 
interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.421281 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.438317 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.438384 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.438442 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.439113 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.439177 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc" gracePeriod=600 Dec 05 00:26:34 crc kubenswrapper[4759]: I1205 00:26:34.442697 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=15.442681265 podStartE2EDuration="15.442681265s" podCreationTimestamp="2025-12-05 00:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:26:34.438642617 +0000 UTC m=+213.654303587" watchObservedRunningTime="2025-12-05 00:26:34.442681265 +0000 UTC m=+213.658342215" Dec 05 00:26:35 crc kubenswrapper[4759]: I1205 00:26:35.428165 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc" exitCode=0 Dec 05 00:26:35 crc kubenswrapper[4759]: I1205 00:26:35.428359 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc"} 
Dec 05 00:26:35 crc kubenswrapper[4759]: I1205 00:26:35.428765 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"19e15a4dc41cf010fb0db81342a545243e6c32979a3aae139e39a7f8a97809a8"} Dec 05 00:26:35 crc kubenswrapper[4759]: I1205 00:26:35.430618 4759 generic.go:334] "Generic (PLEG): container finished" podID="71bd8cd3-7184-4fa6-931c-b6cc8967911b" containerID="33a655d6f1074f0248fcdebd7ed5c5675f4fd6fae503e6e7133e1550c0a62849" exitCode=0 Dec 05 00:26:35 crc kubenswrapper[4759]: I1205 00:26:35.430726 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71bd8cd3-7184-4fa6-931c-b6cc8967911b","Type":"ContainerDied","Data":"33a655d6f1074f0248fcdebd7ed5c5675f4fd6fae503e6e7133e1550c0a62849"} Dec 05 00:26:35 crc kubenswrapper[4759]: I1205 00:26:35.431612 4759 patch_prober.go:28] interesting pod/downloads-7954f5f757-wf4wz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 05 00:26:35 crc kubenswrapper[4759]: I1205 00:26:35.431662 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wf4wz" podUID="f99bc61a-b820-4ebd-8ed0-d18cba6c017a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 05 00:26:35 crc kubenswrapper[4759]: I1205 00:26:35.452739 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=21.452721534 podStartE2EDuration="21.452721534s" podCreationTimestamp="2025-12-05 00:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:26:34.454378549 +0000 UTC m=+213.670039489" watchObservedRunningTime="2025-12-05 00:26:35.452721534 +0000 UTC m=+214.668382494" Dec 05 00:26:37 crc kubenswrapper[4759]: I1205 00:26:37.291189 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 00:26:37 crc kubenswrapper[4759]: I1205 00:26:37.442657 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71bd8cd3-7184-4fa6-931c-b6cc8967911b","Type":"ContainerDied","Data":"27abe169886e9558c45f59baaaf6fcd75d961346dafd99e289faab576205e999"} Dec 05 00:26:37 crc kubenswrapper[4759]: I1205 00:26:37.443161 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27abe169886e9558c45f59baaaf6fcd75d961346dafd99e289faab576205e999" Dec 05 00:26:37 crc kubenswrapper[4759]: I1205 00:26:37.443279 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 00:26:37 crc kubenswrapper[4759]: I1205 00:26:37.475199 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kube-api-access\") pod \"71bd8cd3-7184-4fa6-931c-b6cc8967911b\" (UID: \"71bd8cd3-7184-4fa6-931c-b6cc8967911b\") " Dec 05 00:26:37 crc kubenswrapper[4759]: I1205 00:26:37.475523 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kubelet-dir\") pod \"71bd8cd3-7184-4fa6-931c-b6cc8967911b\" (UID: \"71bd8cd3-7184-4fa6-931c-b6cc8967911b\") " Dec 05 00:26:37 crc kubenswrapper[4759]: I1205 00:26:37.475879 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "71bd8cd3-7184-4fa6-931c-b6cc8967911b" (UID: "71bd8cd3-7184-4fa6-931c-b6cc8967911b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:26:37 crc kubenswrapper[4759]: I1205 00:26:37.498144 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "71bd8cd3-7184-4fa6-931c-b6cc8967911b" (UID: "71bd8cd3-7184-4fa6-931c-b6cc8967911b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:26:37 crc kubenswrapper[4759]: I1205 00:26:37.577152 4759 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 00:26:37 crc kubenswrapper[4759]: I1205 00:26:37.577199 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bd8cd3-7184-4fa6-931c-b6cc8967911b-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 00:26:41 crc kubenswrapper[4759]: I1205 00:26:41.476668 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8t4s6" event={"ID":"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9","Type":"ContainerStarted","Data":"ca201ebcb7de417e1c9f3fe725916d3f43e655bcda5f73088738ed7d28c746b9"} Dec 05 00:26:42 crc kubenswrapper[4759]: I1205 00:26:42.947585 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-wf4wz" Dec 05 00:26:43 crc kubenswrapper[4759]: I1205 00:26:43.490900 4759 generic.go:334] "Generic (PLEG): container finished" podID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" containerID="ca201ebcb7de417e1c9f3fe725916d3f43e655bcda5f73088738ed7d28c746b9" exitCode=0 Dec 05 00:26:43 crc kubenswrapper[4759]: I1205 00:26:43.490950 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8t4s6" event={"ID":"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9","Type":"ContainerDied","Data":"ca201ebcb7de417e1c9f3fe725916d3f43e655bcda5f73088738ed7d28c746b9"} Dec 05 00:26:43 crc kubenswrapper[4759]: I1205 00:26:43.679119 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lr9vd"] Dec 05 00:26:44 crc kubenswrapper[4759]: I1205 00:26:44.606021 4759 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb4c2" event={"ID":"031753f7-0b97-45ec-8e24-a6aeafb09d65","Type":"ContainerStarted","Data":"83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9"} Dec 05 00:26:45 crc kubenswrapper[4759]: I1205 00:26:45.613153 4759 generic.go:334] "Generic (PLEG): container finished" podID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerID="83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9" exitCode=0 Dec 05 00:26:45 crc kubenswrapper[4759]: I1205 00:26:45.613249 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb4c2" event={"ID":"031753f7-0b97-45ec-8e24-a6aeafb09d65","Type":"ContainerDied","Data":"83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9"} Dec 05 00:26:45 crc kubenswrapper[4759]: I1205 00:26:45.617969 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9pqvs" event={"ID":"628054fc-dfa7-402e-8bd0-d56eed57b9fe","Type":"ContainerStarted","Data":"862c7853b4effc5d135af4af71f2bcb2b199ce067d146a675396382ff975ba1b"} Dec 05 00:26:45 crc kubenswrapper[4759]: I1205 00:26:45.621186 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8t4s6" event={"ID":"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9","Type":"ContainerStarted","Data":"b7ef183ab9132d71e9a067153b5cee97a5770698735559d8771fa44e1d10f02f"} Dec 05 00:26:45 crc kubenswrapper[4759]: I1205 00:26:45.659981 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8t4s6" podStartSLOduration=5.7573247819999995 podStartE2EDuration="1m11.659965529s" podCreationTimestamp="2025-12-05 00:25:34 +0000 UTC" firstStartedPulling="2025-12-05 00:25:38.510352746 +0000 UTC m=+157.726013696" lastFinishedPulling="2025-12-05 00:26:44.412993493 +0000 UTC m=+223.628654443" observedRunningTime="2025-12-05 00:26:45.659024013 +0000 UTC m=+224.874684973" watchObservedRunningTime="2025-12-05 00:26:45.659965529 +0000 UTC m=+224.875626479" Dec 05 00:26:46 crc kubenswrapper[4759]: I1205 00:26:46.629843 4759 generic.go:334] "Generic (PLEG): container finished" podID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" containerID="862c7853b4effc5d135af4af71f2bcb2b199ce067d146a675396382ff975ba1b" exitCode=0 Dec 05 00:26:46 crc kubenswrapper[4759]: I1205 00:26:46.629914 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9pqvs" event={"ID":"628054fc-dfa7-402e-8bd0-d56eed57b9fe","Type":"ContainerDied","Data":"862c7853b4effc5d135af4af71f2bcb2b199ce067d146a675396382ff975ba1b"} Dec 05 00:26:54 crc kubenswrapper[4759]: I1205 00:26:54.893175 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:26:54 crc kubenswrapper[4759]: I1205 00:26:54.893905 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:26:55 crc kubenswrapper[4759]: I1205 00:26:55.279968 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:26:55 crc kubenswrapper[4759]: I1205 00:26:55.737403 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:26:55 crc kubenswrapper[4759]: I1205 00:26:55.779904 4759 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-8t4s6"] Dec 05 00:26:57 crc kubenswrapper[4759]: I1205 00:26:57.681994 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8t4s6" podUID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" containerName="registry-server" containerID="cri-o://b7ef183ab9132d71e9a067153b5cee97a5770698735559d8771fa44e1d10f02f" gracePeriod=2 Dec 05 00:26:58 crc kubenswrapper[4759]: I1205 00:26:58.689506 4759 generic.go:334] "Generic (PLEG): container finished" podID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" containerID="b7ef183ab9132d71e9a067153b5cee97a5770698735559d8771fa44e1d10f02f" exitCode=0 Dec 05 00:26:58 crc kubenswrapper[4759]: I1205 00:26:58.689635 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8t4s6" event={"ID":"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9","Type":"ContainerDied","Data":"b7ef183ab9132d71e9a067153b5cee97a5770698735559d8771fa44e1d10f02f"} Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.067198 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.117334 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-utilities\") pod \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.117455 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg72c\" (UniqueName: \"kubernetes.io/projected/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-kube-api-access-qg72c\") pod \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.117482 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-catalog-content\") pod \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\" (UID: \"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9\") " Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.121998 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-utilities" (OuterVolumeSpecName: "utilities") pod "7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" (UID: "7382abb9-f18c-4d5e-90c4-aecf34f4a2d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.134562 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-kube-api-access-qg72c" (OuterVolumeSpecName: "kube-api-access-qg72c") pod "7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" (UID: "7382abb9-f18c-4d5e-90c4-aecf34f4a2d9"). InnerVolumeSpecName "kube-api-access-qg72c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.170423 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" (UID: "7382abb9-f18c-4d5e-90c4-aecf34f4a2d9"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.218589 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg72c\" (UniqueName: \"kubernetes.io/projected/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-kube-api-access-qg72c\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.218621 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.218630 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.842943 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fvzn" event={"ID":"52b6eecd-9d85-46ac-9163-b04da27c2a2c","Type":"ContainerStarted","Data":"8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223"} Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.846446 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8t4s6" event={"ID":"7382abb9-f18c-4d5e-90c4-aecf34f4a2d9","Type":"ContainerDied","Data":"148b21c86169043f4dbee07e60e185e607f815ff288c36cf93445c1083ddf8ca"} Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.846499 4759 scope.go:117] "RemoveContainer" containerID="b7ef183ab9132d71e9a067153b5cee97a5770698735559d8771fa44e1d10f02f" Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.846642 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8t4s6" Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.852246 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgc9q" event={"ID":"39167ad7-8b39-4c2b-b783-88427c69b7eb","Type":"ContainerStarted","Data":"e83cb4675a2ff55a6f6c4f6da8bd3a7edd0f780e80c7ade403fdce0487502197"} Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.858992 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtslc" event={"ID":"eac9e47c-1b1d-4b22-9040-3a198c5758fe","Type":"ContainerStarted","Data":"250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107"} Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.893148 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjn5" event={"ID":"1dc5ee97-3aec-41e6-be6e-c479b3038dd6","Type":"ContainerStarted","Data":"a7681ed73a22689b5ac6ffecd793d6da2449f9ca89350c6e55ece69345378756"} Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.900822 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb4c2" event={"ID":"031753f7-0b97-45ec-8e24-a6aeafb09d65","Type":"ContainerStarted","Data":"d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969"} Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.903580 4759 generic.go:334] "Generic (PLEG): container finished" podID="7c605086-ba35-4534-8732-246afbfde953" containerID="f228a4c1abfee6a272823a673b81c3bf05adb215340c945f30e33e3652e5351a" exitCode=0 Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.903619 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s4sw" event={"ID":"7c605086-ba35-4534-8732-246afbfde953","Type":"ContainerDied","Data":"f228a4c1abfee6a272823a673b81c3bf05adb215340c945f30e33e3652e5351a"} Dec 05 00:27:02 crc kubenswrapper[4759]: I1205 00:27:02.912593 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9pqvs" event={"ID":"628054fc-dfa7-402e-8bd0-d56eed57b9fe","Type":"ContainerStarted","Data":"744b2304556a356aa68ab914b5e2ec67b2747f6ef2587b67fe6f5050f364718a"} Dec 05 00:27:03 crc kubenswrapper[4759]: I1205 00:27:03.031962 4759 scope.go:117] "RemoveContainer" containerID="ca201ebcb7de417e1c9f3fe725916d3f43e655bcda5f73088738ed7d28c746b9" Dec 05 00:27:03 crc kubenswrapper[4759]: I1205 00:27:03.035890 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pb4c2" podStartSLOduration=5.782295085 podStartE2EDuration="1m29.035865974s" podCreationTimestamp="2025-12-05 00:25:34 +0000 UTC" firstStartedPulling="2025-12-05 00:25:38.580942035 +0000 UTC m=+157.796602975" lastFinishedPulling="2025-12-05 00:27:01.834512914 +0000 UTC m=+241.050173864" observedRunningTime="2025-12-05 00:27:02.961937637 +0000 UTC m=+242.177598607" watchObservedRunningTime="2025-12-05 00:27:03.035865974 +0000 UTC m=+242.251526924" Dec 05 00:27:03 crc kubenswrapper[4759]: I1205 00:27:03.076923 4759 scope.go:117] "RemoveContainer" containerID="f3db18ca2a3788256cefcd1fc0e8cba66a9d7c2fd6334c4b094223868eb7e27b" Dec 05 00:27:03 crc kubenswrapper[4759]: I1205 00:27:03.078938 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9pqvs" podStartSLOduration=3.789138145 podStartE2EDuration="1m27.078907141s" podCreationTimestamp="2025-12-05 
00:25:36 +0000 UTC" firstStartedPulling="2025-12-05 00:25:38.598585655 +0000 UTC m=+157.814246605" lastFinishedPulling="2025-12-05 00:27:01.888354651 +0000 UTC m=+241.104015601" observedRunningTime="2025-12-05 00:27:03.069783435 +0000 UTC m=+242.285444385" watchObservedRunningTime="2025-12-05 00:27:03.078907141 +0000 UTC m=+242.294568091" Dec 05 00:27:03 crc kubenswrapper[4759]: I1205 00:27:03.117947 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8t4s6"] Dec 05 00:27:03 crc kubenswrapper[4759]: I1205 00:27:03.125606 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8t4s6"] Dec 05 00:27:03 crc kubenswrapper[4759]: I1205 00:27:03.171715 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" path="/var/lib/kubelet/pods/7382abb9-f18c-4d5e-90c4-aecf34f4a2d9/volumes" Dec 05 00:27:03 crc kubenswrapper[4759]: I1205 00:27:03.925543 4759 generic.go:334] "Generic (PLEG): container finished" podID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerID="8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223" exitCode=0 Dec 05 00:27:03 crc kubenswrapper[4759]: I1205 00:27:03.925615 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fvzn" event={"ID":"52b6eecd-9d85-46ac-9163-b04da27c2a2c","Type":"ContainerDied","Data":"8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223"} Dec 05 00:27:03 crc kubenswrapper[4759]: I1205 00:27:03.932824 4759 generic.go:334] "Generic (PLEG): container finished" podID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerID="e83cb4675a2ff55a6f6c4f6da8bd3a7edd0f780e80c7ade403fdce0487502197" exitCode=0 Dec 05 00:27:03 crc kubenswrapper[4759]: I1205 00:27:03.932855 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgc9q" event={"ID":"39167ad7-8b39-4c2b-b783-88427c69b7eb","Type":"ContainerDied","Data":"e83cb4675a2ff55a6f6c4f6da8bd3a7edd0f780e80c7ade403fdce0487502197"} Dec 05 00:27:04 crc kubenswrapper[4759]: I1205 00:27:04.452541 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:27:04 crc kubenswrapper[4759]: I1205 00:27:04.452861 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:27:04 crc kubenswrapper[4759]: I1205 00:27:04.940273 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s4sw" event={"ID":"7c605086-ba35-4534-8732-246afbfde953","Type":"ContainerStarted","Data":"fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158"} Dec 05 00:27:04 crc kubenswrapper[4759]: I1205 00:27:04.943362 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fvzn" event={"ID":"52b6eecd-9d85-46ac-9163-b04da27c2a2c","Type":"ContainerStarted","Data":"a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a"} Dec 05 00:27:04 crc kubenswrapper[4759]: I1205 00:27:04.946259 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgc9q" event={"ID":"39167ad7-8b39-4c2b-b783-88427c69b7eb","Type":"ContainerStarted","Data":"87a1a8dba60dfff88e376723c7cfcaec94c55efd5890d53b7303c6b5cd4dfbad"} Dec 05 00:27:04 crc kubenswrapper[4759]: I1205 00:27:04.948593 4759 generic.go:334] "Generic (PLEG): container 
finished" podID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerID="250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107" exitCode=0 Dec 05 00:27:04 crc kubenswrapper[4759]: I1205 00:27:04.948652 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtslc" event={"ID":"eac9e47c-1b1d-4b22-9040-3a198c5758fe","Type":"ContainerDied","Data":"250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107"} Dec 05 00:27:04 crc kubenswrapper[4759]: I1205 00:27:04.952814 4759 generic.go:334] "Generic (PLEG): container finished" podID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerID="a7681ed73a22689b5ac6ffecd793d6da2449f9ca89350c6e55ece69345378756" exitCode=0 Dec 05 00:27:04 crc kubenswrapper[4759]: I1205 00:27:04.952845 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjn5" event={"ID":"1dc5ee97-3aec-41e6-be6e-c479b3038dd6","Type":"ContainerDied","Data":"a7681ed73a22689b5ac6ffecd793d6da2449f9ca89350c6e55ece69345378756"} Dec 05 00:27:04 crc kubenswrapper[4759]: I1205 00:27:04.965032 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5s4sw" podStartSLOduration=4.757884848 podStartE2EDuration="1m29.96501887s" podCreationTimestamp="2025-12-05 00:25:35 +0000 UTC" firstStartedPulling="2025-12-05 00:25:38.490285948 +0000 UTC m=+157.705946898" lastFinishedPulling="2025-12-05 00:27:03.69741997 +0000 UTC m=+242.913080920" observedRunningTime="2025-12-05 00:27:04.960265533 +0000 UTC m=+244.175926473" watchObservedRunningTime="2025-12-05 00:27:04.96501887 +0000 UTC m=+244.180679810" Dec 05 00:27:04 crc kubenswrapper[4759]: I1205 00:27:04.984766 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7fvzn" podStartSLOduration=5.139348521 podStartE2EDuration="1m30.98474757s" podCreationTimestamp="2025-12-05 00:25:34 +0000 UTC" firstStartedPulling="2025-12-05 00:25:38.676439052 +0000 UTC m=+157.892100002" lastFinishedPulling="2025-12-05 00:27:04.521838101 +0000 UTC m=+243.737499051" observedRunningTime="2025-12-05 00:27:04.981571775 +0000 UTC m=+244.197232745" watchObservedRunningTime="2025-12-05 00:27:04.98474757 +0000 UTC m=+244.200408520" Dec 05 00:27:05 crc kubenswrapper[4759]: I1205 00:27:05.497325 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-pb4c2" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerName="registry-server" probeResult="failure" output=< Dec 05 00:27:05 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 00:27:05 crc kubenswrapper[4759]: > Dec 05 00:27:05 crc kubenswrapper[4759]: I1205 00:27:05.499458 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:27:05 crc kubenswrapper[4759]: I1205 00:27:05.499496 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:27:05 crc kubenswrapper[4759]: I1205 00:27:05.616972 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:27:05 crc kubenswrapper[4759]: I1205 00:27:05.617042 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:27:05 crc kubenswrapper[4759]: I1205 00:27:05.962713 4759 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtslc" event={"ID":"eac9e47c-1b1d-4b22-9040-3a198c5758fe","Type":"ContainerStarted","Data":"9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a"} Dec 05 00:27:05 crc kubenswrapper[4759]: I1205 00:27:05.964811 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjn5" event={"ID":"1dc5ee97-3aec-41e6-be6e-c479b3038dd6","Type":"ContainerStarted","Data":"8eb710201706ebcda5527009cc00f674d34c3e06bd04a031b6bdadff1cbe0277"} Dec 05 00:27:05 crc kubenswrapper[4759]: I1205 00:27:05.980843 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rtslc" podStartSLOduration=3.344858582 podStartE2EDuration="1m28.980824104s" podCreationTimestamp="2025-12-05 00:25:37 +0000 UTC" firstStartedPulling="2025-12-05 00:25:39.731010368 +0000 UTC m=+158.946671318" lastFinishedPulling="2025-12-05 00:27:05.36697589 +0000 UTC m=+244.582636840" observedRunningTime="2025-12-05 00:27:05.977390292 +0000 UTC m=+245.193051262" watchObservedRunningTime="2025-12-05 00:27:05.980824104 +0000 UTC m=+245.196485054" Dec 05 00:27:05 crc kubenswrapper[4759]: I1205 00:27:05.981044 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lgc9q" podStartSLOduration=6.136232961 podStartE2EDuration="1m31.98103905s" podCreationTimestamp="2025-12-05 00:25:34 +0000 UTC" firstStartedPulling="2025-12-05 00:25:38.539383994 +0000 UTC m=+157.755044944" lastFinishedPulling="2025-12-05 00:27:04.384190083 +0000 UTC m=+243.599851033" observedRunningTime="2025-12-05 00:27:05.049721155 +0000 UTC m=+244.265382145" watchObservedRunningTime="2025-12-05 00:27:05.98103905 +0000 UTC m=+245.196700000" Dec 05 00:27:05 crc kubenswrapper[4759]: I1205 00:27:05.998801 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cmjn5" podStartSLOduration=3.351438209 podStartE2EDuration="1m28.998779956s" podCreationTimestamp="2025-12-05 00:25:37 +0000 UTC" firstStartedPulling="2025-12-05 00:25:39.730487785 +0000 UTC m=+158.946148735" lastFinishedPulling="2025-12-05 00:27:05.377829532 +0000 UTC m=+244.593490482" observedRunningTime="2025-12-05 00:27:05.994235664 +0000 UTC m=+245.209896624" watchObservedRunningTime="2025-12-05 00:27:05.998779956 +0000 UTC m=+245.214440916" Dec 05 00:27:06 crc kubenswrapper[4759]: I1205 00:27:06.268376 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:27:06 crc kubenswrapper[4759]: I1205 00:27:06.268425 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:27:06 crc kubenswrapper[4759]: I1205 00:27:06.308275 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:27:06 crc kubenswrapper[4759]: I1205 00:27:06.546378 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-lgc9q" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerName="registry-server" probeResult="failure" output=< Dec 05 00:27:06 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 00:27:06 crc kubenswrapper[4759]: > Dec 05 00:27:06 crc kubenswrapper[4759]: I1205 00:27:06.653061 4759 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:27:06 crc kubenswrapper[4759]: I1205 00:27:06.653139 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:27:06 crc kubenswrapper[4759]: I1205 00:27:06.653607 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7fvzn" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerName="registry-server" probeResult="failure" output=< Dec 05 00:27:06 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 00:27:06 crc kubenswrapper[4759]: > Dec 05 00:27:06 crc kubenswrapper[4759]: I1205 00:27:06.694611 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9pqvs" Dec 05 00:27:07 crc kubenswrapper[4759]: I1205 00:27:07.410532 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:27:07 crc kubenswrapper[4759]: I1205 00:27:07.410590 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rtslc" Dec 05 00:27:07 crc kubenswrapper[4759]: I1205 00:27:07.818071 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:27:07 crc kubenswrapper[4759]: I1205 00:27:07.818130 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cmjn5" Dec 05 00:27:08 crc kubenswrapper[4759]: I1205 00:27:08.452757 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rtslc" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerName="registry-server" probeResult="failure" output=< Dec 05 00:27:08 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 00:27:08 crc kubenswrapper[4759]: > Dec 05 00:27:08 crc kubenswrapper[4759]: I1205 00:27:08.706045 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" podUID="84b8f271-fcc3-4014-8a36-3e7019bef7c5" containerName="oauth-openshift" containerID="cri-o://543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d" gracePeriod=15 Dec 05 00:27:08 crc kubenswrapper[4759]: I1205 00:27:08.857008 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cmjn5" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerName="registry-server" probeResult="failure" output=< Dec 05 00:27:08 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 00:27:08 crc kubenswrapper[4759]: > Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.901695 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.937918 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2"] Dec 05 00:27:10 crc kubenswrapper[4759]: E1205 00:27:10.938391 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" containerName="extract-content" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.938546 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" containerName="extract-content" Dec 05 00:27:10 crc kubenswrapper[4759]: E1205 00:27:10.938620 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" containerName="registry-server" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.938675 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" containerName="registry-server" Dec 05 00:27:10 crc kubenswrapper[4759]: E1205 00:27:10.938734 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b8f271-fcc3-4014-8a36-3e7019bef7c5" containerName="oauth-openshift" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.938793 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b8f271-fcc3-4014-8a36-3e7019bef7c5" containerName="oauth-openshift" Dec 05 00:27:10 crc kubenswrapper[4759]: E1205 00:27:10.938860 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45df7d8a-597a-42b0-8116-37bf7d3e7627" containerName="image-pruner" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.938922 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45df7d8a-597a-42b0-8116-37bf7d3e7627" containerName="image-pruner" Dec 05 00:27:10 crc kubenswrapper[4759]: E1205 00:27:10.938981 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" containerName="extract-utilities" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.939033 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" containerName="extract-utilities" Dec 05 00:27:10 crc kubenswrapper[4759]: E1205 00:27:10.939095 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bd8cd3-7184-4fa6-931c-b6cc8967911b" containerName="pruner" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.939148 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bd8cd3-7184-4fa6-931c-b6cc8967911b" containerName="pruner" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.939331 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="7382abb9-f18c-4d5e-90c4-aecf34f4a2d9" containerName="registry-server" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.939411 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b8f271-fcc3-4014-8a36-3e7019bef7c5" containerName="oauth-openshift" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.939493 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45df7d8a-597a-42b0-8116-37bf7d3e7627" containerName="image-pruner" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.939553 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bd8cd3-7184-4fa6-931c-b6cc8967911b" containerName="pruner" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.940022 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.948269 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2"] Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.991107 4759 generic.go:334] "Generic (PLEG): container finished" podID="84b8f271-fcc3-4014-8a36-3e7019bef7c5" containerID="543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d" exitCode=0 Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.991162 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.991162 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" event={"ID":"84b8f271-fcc3-4014-8a36-3e7019bef7c5","Type":"ContainerDied","Data":"543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d"} Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.991566 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lr9vd" event={"ID":"84b8f271-fcc3-4014-8a36-3e7019bef7c5","Type":"ContainerDied","Data":"8bae4a32a2d8ee1d41aec691844fc5cb647da5a4a3805811ad7b04664500adc0"} Dec 05 00:27:10 crc kubenswrapper[4759]: I1205 00:27:10.991594 4759 scope.go:117] "RemoveContainer" containerID="543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.007795 4759 scope.go:117] "RemoveContainer" containerID="543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d" Dec 05 00:27:11 crc kubenswrapper[4759]: E1205 00:27:11.008189 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d\": container with ID starting with 543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d not found: ID does not exist" containerID="543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.008229 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d"} err="failed to get container status \"543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d\": rpc error: code = NotFound desc = could not find container \"543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d\": container with ID starting with 543333852d94d6827a38e89c9e0aa98176a7168882f7249230d889163886890d not found: ID does not exist" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.044931 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.045123 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-policies\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 
05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.045158 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-ocp-branding-template\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.045188 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-idp-0-file-data\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.045214 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-login\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.045231 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbswj\" (UniqueName: \"kubernetes.io/projected/84b8f271-fcc3-4014-8a36-3e7019bef7c5-kube-api-access-kbswj\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.045258 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-service-ca\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.045277 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.045867 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-trusted-ca-bundle\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.046222 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.046431 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-serving-cert\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.047288 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-cliconfig\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.046574 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.047210 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.048050 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.047392 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-provider-selection\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.048163 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-router-certs\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.048189 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-dir\") pod \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\" (UID: \"84b8f271-fcc3-4014-8a36-3e7019bef7c5\") " Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.048250 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.048911 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049010 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-router-certs\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049047 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049182 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-service-ca\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049248 
4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtsnb\" (UniqueName: \"kubernetes.io/projected/d3054fff-52bd-437c-a203-aadefcb88d98-kube-api-access-gtsnb\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049315 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-session\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049345 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049374 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-template-error\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049426 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-audit-policies\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049454 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049500 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3054fff-52bd-437c-a203-aadefcb88d98-audit-dir\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049531 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-cliconfig\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " 
pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049554 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-serving-cert\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049587 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-template-login\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049664 4759 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049690 4759 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049706 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049720 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.049733 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.050567 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.051556 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b8f271-fcc3-4014-8a36-3e7019bef7c5-kube-api-access-kbswj" (OuterVolumeSpecName: "kube-api-access-kbswj") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "kube-api-access-kbswj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.051646 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.051953 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.052529 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.052785 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.053362 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.057844 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.060555 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "84b8f271-fcc3-4014-8a36-3e7019bef7c5" (UID: "84b8f271-fcc3-4014-8a36-3e7019bef7c5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151099 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3054fff-52bd-437c-a203-aadefcb88d98-audit-dir\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151164 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-cliconfig\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151194 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-serving-cert\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151225 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-template-login\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151271 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151323 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-router-certs\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151348 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151388 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-service-ca\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " 
pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151515 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtsnb\" (UniqueName: \"kubernetes.io/projected/d3054fff-52bd-437c-a203-aadefcb88d98-kube-api-access-gtsnb\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151542 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-session\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151563 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151599 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-template-error\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151635 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-audit-policies\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151664 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151740 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151764 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151782 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151801 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151819 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151836 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151853 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbswj\" (UniqueName: \"kubernetes.io/projected/84b8f271-fcc3-4014-8a36-3e7019bef7c5-kube-api-access-kbswj\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151870 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.151888 4759 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/84b8f271-fcc3-4014-8a36-3e7019bef7c5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.152352 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3054fff-52bd-437c-a203-aadefcb88d98-audit-dir\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.153414 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-audit-policies\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.153487 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-service-ca\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.153895 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " 
pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.154011 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-cliconfig\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.157037 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-session\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.157033 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.157875 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.158921 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.159300 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-router-certs\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.159946 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-template-error\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.160099 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-user-template-login\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " 
pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.160103 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3054fff-52bd-437c-a203-aadefcb88d98-v4-0-config-system-serving-cert\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.180532 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtsnb\" (UniqueName: \"kubernetes.io/projected/d3054fff-52bd-437c-a203-aadefcb88d98-kube-api-access-gtsnb\") pod \"oauth-openshift-74bc74d8d6-fv2r2\" (UID: \"d3054fff-52bd-437c-a203-aadefcb88d98\") " pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.260917 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.341353 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lr9vd"] Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.345675 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lr9vd"] Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.647437 4759 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.648221 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.648472 4759 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.649018 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7" gracePeriod=15 Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.649103 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3" gracePeriod=15 Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.649174 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9" gracePeriod=15 Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.649185 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25" gracePeriod=15 Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.649552 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1" gracePeriod=15 Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.652894 4759 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 00:27:11 crc kubenswrapper[4759]: E1205 00:27:11.653296 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.653425 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 00:27:11 crc kubenswrapper[4759]: E1205 00:27:11.653464 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.653518 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 00:27:11 crc kubenswrapper[4759]: E1205 00:27:11.653543 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.653563 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 
00:27:11 crc kubenswrapper[4759]: E1205 00:27:11.653585 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.653603 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 00:27:11 crc kubenswrapper[4759]: E1205 00:27:11.653620 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.653637 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 00:27:11 crc kubenswrapper[4759]: E1205 00:27:11.653660 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.653676 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 00:27:11 crc kubenswrapper[4759]: E1205 00:27:11.653712 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.653728 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.653976 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.654010 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.654028 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.654053 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.654084 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.654640 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.664164 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.664265 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.664321 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.664350 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.664372 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.664388 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.664420 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.664471 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.765713 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.765782 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.765814 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.765835 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.765863 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.765906 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.765936 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.765984 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.766062 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.766107 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.766137 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.766188 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.766329 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.766385 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.766378 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:11 crc kubenswrapper[4759]: I1205 00:27:11.766441 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:12 crc kubenswrapper[4759]: I1205 00:27:12.000197 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 00:27:12 crc kubenswrapper[4759]: I1205 00:27:12.001695 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 00:27:12 crc kubenswrapper[4759]: I1205 00:27:12.002511 4759 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25" exitCode=0 Dec 05 00:27:12 crc kubenswrapper[4759]: I1205 00:27:12.002545 4759 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9" exitCode=0 Dec 05 00:27:12 crc kubenswrapper[4759]: I1205 00:27:12.002561 4759 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3" exitCode=0 Dec 05 00:27:12 crc kubenswrapper[4759]: I1205 00:27:12.002574 4759 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1" exitCode=2 Dec 05 00:27:12 crc kubenswrapper[4759]: I1205 00:27:12.002621 4759 scope.go:117] "RemoveContainer" containerID="7b911398f38b7fcf132d0a90cb91c02e6eed1a9caf25664959e55856475aa6f8" Dec 05 00:27:12 crc kubenswrapper[4759]: I1205 00:27:12.048494 4759 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 00:27:12 crc kubenswrapper[4759]: I1205 00:27:12.048567 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.330585 4759 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.330945 4759 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.331366 4759 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.331703 4759 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.331981 4759 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:12 crc kubenswrapper[4759]: I1205 00:27:12.332020 4759 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.332379 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="200ms" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.533293 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="400ms" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.583227 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:27:12Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:27:12Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:27:12Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:27:12Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:15adb3b2133604b064893f8009a74145e4c8bb5b134d111346dcccbdd2aa9bc2\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:164fc35a19aa6cc886c8015c8ee3eba4895e76b1152cb9d795e4f3154a8533a3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610512706},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a929531bb959f0b8fee26224ee1c20db089abfeca0140403ae1f0c3363ef71d1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:f8716572be76ae0a4e79f51c5a917183459b6b2ceacbd574fe24b5a9c15805b1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1208070485},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:ac32ea4cbca6dcedab2f1028fc75366162e84595a3ff0fb4192fbc17ee9ba797\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ad35c4d0cb0084bc680e1f3125665dfcede9f6d899e97c97fbb4a2dc69a23aea\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201944900},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\
\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.583830 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.584182 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.584472 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.584690 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.584708 4759 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 00:27:12 crc kubenswrapper[4759]: E1205 00:27:12.933884 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="800ms" Dec 05 00:27:13 crc kubenswrapper[4759]: I1205 00:27:13.165461 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84b8f271-fcc3-4014-8a36-3e7019bef7c5" path="/var/lib/kubelet/pods/84b8f271-fcc3-4014-8a36-3e7019bef7c5/volumes" Dec 05 00:27:13 crc kubenswrapper[4759]: E1205 00:27:13.734806 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="1.6s" Dec 05 00:27:14 crc kubenswrapper[4759]: I1205 00:27:14.017224 4759 generic.go:334] "Generic (PLEG): container finished" podID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" containerID="5f73efcaa9aaceecce52a5f5e4db4e52894a0b02aa209cc3b6fac154e727e3c4" exitCode=0 Dec 05 00:27:14 crc kubenswrapper[4759]: I1205 00:27:14.017349 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dbabb1ea-7574-4d58-9a61-0982f4d1897a","Type":"ContainerDied","Data":"5f73efcaa9aaceecce52a5f5e4db4e52894a0b02aa209cc3b6fac154e727e3c4"} Dec 05 00:27:14 crc kubenswrapper[4759]: I1205 00:27:14.018343 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:14 crc kubenswrapper[4759]: I1205 00:27:14.020151 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 00:27:14 crc kubenswrapper[4759]: I1205 00:27:14.520123 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:27:14 crc kubenswrapper[4759]: I1205 00:27:14.520812 
4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:14 crc kubenswrapper[4759]: I1205 00:27:14.521162 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:14 crc kubenswrapper[4759]: I1205 00:27:14.567522 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pb4c2" Dec 05 00:27:14 crc kubenswrapper[4759]: I1205 00:27:14.568082 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:14 crc kubenswrapper[4759]: I1205 00:27:14.568446 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.028731 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.029704 4759 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7" exitCode=0 Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.117634 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.118855 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.119599 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.120033 4759 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.120443 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.213712 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.214420 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.214450 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.214537 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.214536 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.214689 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.215037 4759 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.215055 4759 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.215068 4759 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.293485 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.293889 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.294113 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: E1205 00:27:15.335908 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="3.2s" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.417929 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kube-api-access\") pod \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\" (UID: \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.417995 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kubelet-dir\") pod \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\" (UID: \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.418069 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dbabb1ea-7574-4d58-9a61-0982f4d1897a" (UID: "dbabb1ea-7574-4d58-9a61-0982f4d1897a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.418131 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-var-lock\") pod \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\" (UID: \"dbabb1ea-7574-4d58-9a61-0982f4d1897a\") " Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.418223 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-var-lock" (OuterVolumeSpecName: "var-lock") pod "dbabb1ea-7574-4d58-9a61-0982f4d1897a" (UID: "dbabb1ea-7574-4d58-9a61-0982f4d1897a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.418579 4759 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.418603 4759 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.426063 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dbabb1ea-7574-4d58-9a61-0982f4d1897a" (UID: "dbabb1ea-7574-4d58-9a61-0982f4d1897a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.519359 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbabb1ea-7574-4d58-9a61-0982f4d1897a-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.553185 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.554071 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.554797 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.555467 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.600107 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.600839 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.601719 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.602275 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.656077 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.656924 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.657581 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.658139 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.658626 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.692958 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7fvzn" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.693536 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial 
tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.694038 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.694613 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:15 crc kubenswrapper[4759]: I1205 00:27:15.694880 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.039202 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dbabb1ea-7574-4d58-9a61-0982f4d1897a","Type":"ContainerDied","Data":"5eed89ef304b520fe180547d90744868a19fe060ee87795e616860322ef32b5b"} Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.039282 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eed89ef304b520fe180547d90744868a19fe060ee87795e616860322ef32b5b" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.039236 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.043687 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.044933 4759 scope.go:117] "RemoveContainer" containerID="9e486c43d1b91cd04bd39ffc97d2e59eaae32f212549f4ad97851c7b2f6fca25" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.044967 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.046501 4759 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.047002 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.047709 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.048126 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.048600 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.063054 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.063597 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.064022 4759 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.064267 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial 
tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.064294 4759 scope.go:117] "RemoveContainer" containerID="c86be97ec419ff1862fc666ce95964bf5bf9d55a35f229a63fcb5489e25e5ec9" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.064522 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.064902 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.065161 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.065521 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.065783 4759 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.066301 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.083491 4759 scope.go:117] "RemoveContainer" containerID="cf8ee533639430c7fb0a80d22e8834515637c8c3cac6bd430a93aa73ba4260a3" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.095708 4759 scope.go:117] "RemoveContainer" containerID="e05c01d1c606d2d378ff0eb40bf678226431722c616bfecc546a3258c5256ed1" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.109424 4759 scope.go:117] "RemoveContainer" containerID="b8592f8ad9fe3e97f6af504e15a807aa52185e11b72ed6a86fec5d1490b98ec7" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.127566 4759 scope.go:117] "RemoveContainer" containerID="4f8177fd0a0a8bb2fd010d0670969676f3656895a85b680c8ceb47ea94631af5" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.336839 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5s4sw" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 
00:27:16.337819 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.338206 4759 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.338831 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.339082 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.339274 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.339497 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:16 crc kubenswrapper[4759]: E1205 00:27:16.691889 4759 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.150:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.692608 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.703921 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9pqvs"
Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.704944 4759 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.705382 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.705784 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.706183 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.706545 4759 status_manager.go:851] "Failed to get status for pod" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" pod="openshift-marketplace/redhat-marketplace-9pqvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9pqvs\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.706907 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:16 crc kubenswrapper[4759]: I1205 00:27:16.707282 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:16 crc kubenswrapper[4759]: W1205 00:27:16.724084 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-39a45e73f8fd841b5e31ca408c1fb5b23404509627a8cd9de2a6789b695b4612 WatchSource:0}: Error finding container 39a45e73f8fd841b5e31ca408c1fb5b23404509627a8cd9de2a6789b695b4612: Status 404 returned error can't find the container with id 39a45e73f8fd841b5e31ca408c1fb5b23404509627a8cd9de2a6789b695b4612
Dec 05 00:27:16 crc kubenswrapper[4759]: E1205 00:27:16.728501 4759 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.150:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e2a23af39d0f1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 00:27:16.727836913 +0000 UTC m=+255.943497873,LastTimestamp:2025-12-05 00:27:16.727836913 +0000 UTC m=+255.943497873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 00:27:16 crc kubenswrapper[4759]: E1205 00:27:16.946914 4759 log.go:32] "RunPodSandbox from runtime service failed" err=<
Dec 05 00:27:16 crc kubenswrapper[4759]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication_d3054fff-52bd-437c-a203-aadefcb88d98_0(4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79): error adding pod openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79" Netns:"/var/run/netns/c52cf561-c15a-4364-892b-5d721b995448" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-74bc74d8d6-fv2r2;K8S_POD_INFRA_CONTAINER_ID=4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79;K8S_POD_UID=d3054fff-52bd-437c-a203-aadefcb88d98" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2] networking: Multus: [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2/d3054fff-52bd-437c-a203-aadefcb88d98]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-74bc74d8d6-fv2r2?timeout=1m0s": dial tcp 38.102.83.150:6443: connect: connection refused
Dec 05 00:27:16 crc kubenswrapper[4759]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Dec 05 00:27:16 crc kubenswrapper[4759]: >
Dec 05 00:27:16 crc kubenswrapper[4759]: E1205 00:27:16.947412 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Dec 05 00:27:16 crc kubenswrapper[4759]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication_d3054fff-52bd-437c-a203-aadefcb88d98_0(4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79): error adding pod openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79" Netns:"/var/run/netns/c52cf561-c15a-4364-892b-5d721b995448" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-74bc74d8d6-fv2r2;K8S_POD_INFRA_CONTAINER_ID=4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79;K8S_POD_UID=d3054fff-52bd-437c-a203-aadefcb88d98" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2] networking: Multus: [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2/d3054fff-52bd-437c-a203-aadefcb88d98]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-74bc74d8d6-fv2r2?timeout=1m0s": dial tcp 38.102.83.150:6443: connect: connection refused
Dec 05 00:27:16 crc kubenswrapper[4759]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Dec 05 00:27:16 crc kubenswrapper[4759]: > pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2"
Dec 05 00:27:16 crc kubenswrapper[4759]: E1205 00:27:16.947437 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Dec 05 00:27:16 crc kubenswrapper[4759]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication_d3054fff-52bd-437c-a203-aadefcb88d98_0(4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79): error adding pod openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79" Netns:"/var/run/netns/c52cf561-c15a-4364-892b-5d721b995448" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-74bc74d8d6-fv2r2;K8S_POD_INFRA_CONTAINER_ID=4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79;K8S_POD_UID=d3054fff-52bd-437c-a203-aadefcb88d98" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2] networking: Multus: [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2/d3054fff-52bd-437c-a203-aadefcb88d98]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-74bc74d8d6-fv2r2?timeout=1m0s": dial tcp 38.102.83.150:6443: connect: connection refused
Dec 05 00:27:16 crc kubenswrapper[4759]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Dec 05 00:27:16 crc kubenswrapper[4759]: > pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2"
Dec 05 00:27:16 crc kubenswrapper[4759]: E1205 00:27:16.947518 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication(d3054fff-52bd-437c-a203-aadefcb88d98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication(d3054fff-52bd-437c-a203-aadefcb88d98)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication_d3054fff-52bd-437c-a203-aadefcb88d98_0(4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79): error adding pod openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79\\\" Netns:\\\"/var/run/netns/c52cf561-c15a-4364-892b-5d721b995448\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-74bc74d8d6-fv2r2;K8S_POD_INFRA_CONTAINER_ID=4921d056893d2f25f2a810ee4507e32f19f0a683b5f8d83503c3652077589d79;K8S_POD_UID=d3054fff-52bd-437c-a203-aadefcb88d98\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2] networking: Multus: [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2/d3054fff-52bd-437c-a203-aadefcb88d98]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-74bc74d8d6-fv2r2?timeout=1m0s\\\": dial tcp 38.102.83.150:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podUID="d3054fff-52bd-437c-a203-aadefcb88d98"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.054873 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"39a45e73f8fd841b5e31ca408c1fb5b23404509627a8cd9de2a6789b695b4612"}
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.054894 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.055454 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.164389 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.490953 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rtslc"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.492005 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.492415 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.492797 4759 status_manager.go:851] "Failed to get status for pod" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" pod="openshift-marketplace/redhat-operators-rtslc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rtslc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.493055 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.493215 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.493369 4759 status_manager.go:851] "Failed to get status for pod" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" pod="openshift-marketplace/redhat-marketplace-9pqvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9pqvs\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.493504 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.528916 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rtslc"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.529582 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.530054 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.530264 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.530492 4759 status_manager.go:851] "Failed to get status for pod" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" pod="openshift-marketplace/redhat-operators-rtslc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rtslc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.530690 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.530926 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.531283 4759 status_manager.go:851] "Failed to get status for pod" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" pod="openshift-marketplace/redhat-marketplace-9pqvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9pqvs\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: E1205 00:27:17.673470 4759 log.go:32] "RunPodSandbox from runtime service failed" err=<
Dec 05 00:27:17 crc kubenswrapper[4759]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication_d3054fff-52bd-437c-a203-aadefcb88d98_0(f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba): error adding pod openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba" Netns:"/var/run/netns/88d878e8-c574-468f-93ae-eb470a151830" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-74bc74d8d6-fv2r2;K8S_POD_INFRA_CONTAINER_ID=f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba;K8S_POD_UID=d3054fff-52bd-437c-a203-aadefcb88d98" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2] networking: Multus: [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2/d3054fff-52bd-437c-a203-aadefcb88d98]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-74bc74d8d6-fv2r2?timeout=1m0s": dial tcp 38.102.83.150:6443: connect: connection refused
Dec 05 00:27:17 crc kubenswrapper[4759]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Dec 05 00:27:17 crc kubenswrapper[4759]: >
Dec 05 00:27:17 crc kubenswrapper[4759]: E1205 00:27:17.673534 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Dec 05 00:27:17 crc kubenswrapper[4759]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication_d3054fff-52bd-437c-a203-aadefcb88d98_0(f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba): error adding pod openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba" Netns:"/var/run/netns/88d878e8-c574-468f-93ae-eb470a151830" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-74bc74d8d6-fv2r2;K8S_POD_INFRA_CONTAINER_ID=f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba;K8S_POD_UID=d3054fff-52bd-437c-a203-aadefcb88d98" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2] networking: Multus: [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2/d3054fff-52bd-437c-a203-aadefcb88d98]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-74bc74d8d6-fv2r2?timeout=1m0s": dial tcp 38.102.83.150:6443: connect: connection refused
Dec 05 00:27:17 crc kubenswrapper[4759]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Dec 05 00:27:17 crc kubenswrapper[4759]: > pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2"
Dec 05 00:27:17 crc kubenswrapper[4759]: E1205 00:27:17.673554 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Dec 05 00:27:17 crc kubenswrapper[4759]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication_d3054fff-52bd-437c-a203-aadefcb88d98_0(f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba): error adding pod openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba" Netns:"/var/run/netns/88d878e8-c574-468f-93ae-eb470a151830" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-74bc74d8d6-fv2r2;K8S_POD_INFRA_CONTAINER_ID=f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba;K8S_POD_UID=d3054fff-52bd-437c-a203-aadefcb88d98" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2] networking: Multus: [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2/d3054fff-52bd-437c-a203-aadefcb88d98]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-74bc74d8d6-fv2r2?timeout=1m0s": dial tcp 38.102.83.150:6443: connect: connection refused
Dec 05 00:27:17 crc kubenswrapper[4759]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Dec 05 00:27:17 crc kubenswrapper[4759]: > pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2"
Dec 05 00:27:17 crc kubenswrapper[4759]: E1205 00:27:17.673608 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication(d3054fff-52bd-437c-a203-aadefcb88d98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication(d3054fff-52bd-437c-a203-aadefcb88d98)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication_d3054fff-52bd-437c-a203-aadefcb88d98_0(f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba): error adding pod openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba\\\" Netns:\\\"/var/run/netns/88d878e8-c574-468f-93ae-eb470a151830\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-74bc74d8d6-fv2r2;K8S_POD_INFRA_CONTAINER_ID=f99ff88852328bce5713ffec1d689513b7d8c410f6fc5f2c1d42e2af3d0032ba;K8S_POD_UID=d3054fff-52bd-437c-a203-aadefcb88d98\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2] networking: Multus: [openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2/d3054fff-52bd-437c-a203-aadefcb88d98]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-74bc74d8d6-fv2r2 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-74bc74d8d6-fv2r2?timeout=1m0s\\\": dial tcp 38.102.83.150:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podUID="d3054fff-52bd-437c-a203-aadefcb88d98"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.862934 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cmjn5"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.863785 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.864176 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.864593 4759 status_manager.go:851] "Failed to get status for pod" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" pod="openshift-marketplace/redhat-operators-rtslc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rtslc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.864868 4759 status_manager.go:851] "Failed to get status for pod" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" pod="openshift-marketplace/redhat-operators-cmjn5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cmjn5\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.865162 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.865581 4759 status_manager.go:851] "Failed to get status for pod" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" pod="openshift-marketplace/redhat-marketplace-9pqvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9pqvs\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.866003 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.866543 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.910107 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cmjn5"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.910704 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.911039 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.911376 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.911719 4759 status_manager.go:851] "Failed to get status for pod" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" pod="openshift-marketplace/redhat-operators-rtslc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rtslc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.912036 4759 status_manager.go:851] "Failed to get status for pod" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" pod="openshift-marketplace/redhat-operators-cmjn5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cmjn5\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.912377 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.912689 4759 status_manager.go:851] "Failed to get status for pod" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" pod="openshift-marketplace/redhat-marketplace-9pqvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9pqvs\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:17 crc kubenswrapper[4759]: I1205 00:27:17.912990 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:18 crc kubenswrapper[4759]: E1205 00:27:18.157854 4759 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.150:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" volumeName="registry-storage"
Dec 05 00:27:18 crc kubenswrapper[4759]: E1205 00:27:18.537262 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="6.4s"
Dec 05 00:27:20 crc kubenswrapper[4759]: I1205 00:27:20.073735 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906"}
Dec 05 00:27:20 crc kubenswrapper[4759]: I1205 00:27:20.074596 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:20 crc kubenswrapper[4759]: E1205 00:27:20.074937 4759 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.150:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 00:27:20 crc kubenswrapper[4759]: I1205 00:27:20.074990 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:20 crc kubenswrapper[4759]: I1205 00:27:20.075505 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:20 crc kubenswrapper[4759]: I1205 00:27:20.075862 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:20 crc kubenswrapper[4759]: I1205 00:27:20.076397 4759 status_manager.go:851] "Failed to get status for pod" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" pod="openshift-marketplace/redhat-operators-rtslc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rtslc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:20 crc kubenswrapper[4759]: I1205 00:27:20.077004 4759 status_manager.go:851] "Failed to get status for pod" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" pod="openshift-marketplace/redhat-operators-cmjn5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cmjn5\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:20 crc kubenswrapper[4759]: I1205 00:27:20.077551 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:20 crc kubenswrapper[4759]: I1205 00:27:20.077949 4759 status_manager.go:851] "Failed to get status for pod" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" pod="openshift-marketplace/redhat-marketplace-9pqvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9pqvs\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:21 crc kubenswrapper[4759]: E1205 00:27:21.079273 4759 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.150:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 05 00:27:21 crc kubenswrapper[4759]: I1205 00:27:21.160716 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:21 crc kubenswrapper[4759]: I1205 00:27:21.161154 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:21 crc kubenswrapper[4759]: I1205 00:27:21.161658 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:21 crc kubenswrapper[4759]: I1205 00:27:21.161939 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:21 crc kubenswrapper[4759]: I1205 00:27:21.162363 4759 status_manager.go:851] "Failed to get status for pod" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" pod="openshift-marketplace/redhat-operators-rtslc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rtslc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:21 crc kubenswrapper[4759]: I1205 00:27:21.162902 4759 status_manager.go:851] "Failed to get status for pod" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" pod="openshift-marketplace/redhat-operators-cmjn5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cmjn5\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:21 crc kubenswrapper[4759]: I1205 00:27:21.163210 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:21 crc kubenswrapper[4759]: I1205 00:27:21.163717 4759 status_manager.go:851] "Failed to get status for pod" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" pod="openshift-marketplace/redhat-marketplace-9pqvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9pqvs\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:22 crc kubenswrapper[4759]: E1205 00:27:22.604190 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:27:22Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:27:22Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:27:22Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T00:27:22Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:15adb3b2133604b064893f8009a74145e4c8bb5b134d111346dcccbdd2aa9bc2\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:164fc35a19aa6cc886c8015c8ee3eba4895e76b1152cb9d795e4f3154a8533a3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610512706},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a929531bb959f0b8fee26224ee1c20db089abfeca0140403ae1f0c3363ef71d1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:f8716572be76ae0a4e79f51c5a917183459b6b2ceacbd574fe24b5a9c15805b1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1208070485},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:ac32ea4cbca6dcedab2f1028fc75366162e84595a3ff0fb4192fbc17ee9ba797\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ad35c4d0cb0084bc680e1f3125665dfcede9f6d899e97c97fbb4a2dc69a23aea\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201944900},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:22 crc kubenswrapper[4759]: E1205 00:27:22.605141 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:22 crc kubenswrapper[4759]: E1205 00:27:22.605489 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:22 crc kubenswrapper[4759]: E1205 00:27:22.605628 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:22 crc kubenswrapper[4759]: E1205 00:27:22.605778 4759 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:22 crc kubenswrapper[4759]: E1205 00:27:22.605801 4759 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Dec 05 00:27:23 crc kubenswrapper[4759]: E1205 00:27:23.544146 4759 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.150:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e2a23af39d0f1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 00:27:16.727836913 +0000 UTC m=+255.943497873,LastTimestamp:2025-12-05 00:27:16.727836913 +0000 UTC m=+255.943497873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 00:27:24 crc kubenswrapper[4759]: E1205 00:27:24.939264 4759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.150:6443: connect: connection refused" interval="7s"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.112472 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.112531 4759 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9" exitCode=1
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.112563 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9"}
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.113094 4759 scope.go:117] "RemoveContainer" containerID="ff3865d487950c34948cffaa27c316b478484b94dc95aec86f5f93ea669a51a9"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.113783 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.114446 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.114930 4759 status_manager.go:851] "Failed to get status for pod" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" pod="openshift-marketplace/redhat-operators-rtslc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rtslc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.115336 4759 status_manager.go:851] "Failed to get status for pod" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" pod="openshift-marketplace/redhat-operators-cmjn5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cmjn5\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.115667 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.116108 4759 status_manager.go:851] "Failed to get status for pod" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" pod="openshift-marketplace/redhat-marketplace-9pqvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9pqvs\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.116716 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.117224 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.117809 4759 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.154908 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.155878 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.156221 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.156726 4759 status_manager.go:851] "Failed to get status for pod" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" pod="openshift-marketplace/redhat-operators-rtslc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rtslc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.157503 4759 status_manager.go:851] "Failed to get status for pod" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" pod="openshift-marketplace/redhat-operators-cmjn5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cmjn5\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.158277 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.158748 4759 status_manager.go:851] "Failed to get status for pod" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" pod="openshift-marketplace/redhat-marketplace-9pqvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9pqvs\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.159177 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.159607 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.160035 4759 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.221275 4759 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef44e93e-b269-459c-b2ae-22a70267bc87"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.221313 4759 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef44e93e-b269-459c-b2ae-22a70267bc87"
Dec 05 00:27:26 crc kubenswrapper[4759]: E1205 00:27:26.221785 4759 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 00:27:26 crc kubenswrapper[4759]: I1205 00:27:26.222565 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.130347 4759 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="34ab303ecd35205bfd5956520aace5ffca1d79e1fd7f4093f3954c9ae6e2fe48" exitCode=0
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.130466 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"34ab303ecd35205bfd5956520aace5ffca1d79e1fd7f4093f3954c9ae6e2fe48"}
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.130707 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"47b55dc9b922f93bb91a8cd241bc954d43f22c686b98bd8854c9a8435df439e9"}
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.130996 4759 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef44e93e-b269-459c-b2ae-22a70267bc87"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.131011 4759 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef44e93e-b269-459c-b2ae-22a70267bc87"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.132344 4759 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:27 crc kubenswrapper[4759]: E1205 00:27:27.132361 4759 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.132732 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.133012 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.133341 4759 status_manager.go:851] "Failed to get status for pod" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" pod="openshift-marketplace/redhat-operators-rtslc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rtslc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.133632 4759 status_manager.go:851] "Failed to get status for pod" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" pod="openshift-marketplace/redhat-operators-cmjn5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cmjn5\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.134037 4759 status_manager.go:851] "Failed to get status for pod" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" pod="openshift-marketplace/redhat-marketplace-9pqvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9pqvs\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.134365 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.134654 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.134918 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.138644 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.138716 4759 kubelet.go:2453] "SyncLoop
(PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d0014c94105a5d6e1ab3b5da4fb3e3f10c8cf69872d952ffd395246c43f0029e"} Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.139447 4759 status_manager.go:851] "Failed to get status for pod" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" pod="openshift-marketplace/redhat-marketplace-9pqvs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9pqvs\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.139715 4759 status_manager.go:851] "Failed to get status for pod" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" pod="openshift-marketplace/certified-operators-lgc9q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lgc9q\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.140009 4759 status_manager.go:851] "Failed to get status for pod" podUID="7c605086-ba35-4534-8732-246afbfde953" pod="openshift-marketplace/redhat-marketplace-5s4sw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5s4sw\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.140272 4759 status_manager.go:851] "Failed to get status for pod" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.140692 4759 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.141131 4759 status_manager.go:851] "Failed to get status for pod" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" pod="openshift-marketplace/certified-operators-7fvzn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-7fvzn\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.141390 4759 status_manager.go:851] "Failed to get status for pod" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" pod="openshift-marketplace/community-operators-pb4c2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-pb4c2\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.141611 4759 status_manager.go:851] "Failed to get status for pod" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" pod="openshift-marketplace/redhat-operators-rtslc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-rtslc\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:27 crc kubenswrapper[4759]: I1205 00:27:27.141825 4759 status_manager.go:851] "Failed to get status for pod" 
podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" pod="openshift-marketplace/redhat-operators-cmjn5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cmjn5\": dial tcp 38.102.83.150:6443: connect: connection refused" Dec 05 00:27:28 crc kubenswrapper[4759]: I1205 00:27:28.145986 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6a8c7ca18f57f5981dea1a714c080ae1c90a0849f4031aae79da2b5d1f8d6c58"} Dec 05 00:27:28 crc kubenswrapper[4759]: I1205 00:27:28.146262 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7c40c1f2bdc6d21c77c19a221c406023d2daa4d8bd49aa83adfd8673c50931ab"} Dec 05 00:27:28 crc kubenswrapper[4759]: I1205 00:27:28.146276 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3164121b813ea3967859349f5fe7364001793e2b366a3254f52210cc1095ef8f"} Dec 05 00:27:28 crc kubenswrapper[4759]: I1205 00:27:28.146285 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1a2aba30c2bfa36e86043ca862b8756845435a7f7f2871f6ebacf4bc15e26c78"} Dec 05 00:27:29 crc kubenswrapper[4759]: I1205 00:27:29.155727 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:29 crc kubenswrapper[4759]: I1205 00:27:29.156329 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:29 crc kubenswrapper[4759]: I1205 00:27:29.175215 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"90a9bb299797cceaa60c4533d1fe21f9bf1226ec2d163a80e89b3daf2daa905e"} Dec 05 00:27:29 crc kubenswrapper[4759]: I1205 00:27:29.175566 4759 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef44e93e-b269-459c-b2ae-22a70267bc87" Dec 05 00:27:29 crc kubenswrapper[4759]: I1205 00:27:29.175588 4759 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef44e93e-b269-459c-b2ae-22a70267bc87" Dec 05 00:27:29 crc kubenswrapper[4759]: I1205 00:27:29.175778 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:31 crc kubenswrapper[4759]: I1205 00:27:31.223617 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:31 crc kubenswrapper[4759]: I1205 00:27:31.223896 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:31 crc kubenswrapper[4759]: I1205 00:27:31.230149 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:33 crc kubenswrapper[4759]: I1205 00:27:33.167632 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:27:34 crc kubenswrapper[4759]: I1205 00:27:34.187990 4759 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:34 crc kubenswrapper[4759]: I1205 00:27:34.398249 4759 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="37c47aee-0522-4f6e-93ac-108c78b2f14a" Dec 05 00:27:35 crc kubenswrapper[4759]: I1205 00:27:35.216463 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" event={"ID":"d3054fff-52bd-437c-a203-aadefcb88d98","Type":"ContainerStarted","Data":"ab1b077a4418b6b32d62c6be6fcef7092d22e7d6231e8a54f4968dd8a9b52471"} Dec 05 00:27:35 crc kubenswrapper[4759]: I1205 00:27:35.218414 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" event={"ID":"d3054fff-52bd-437c-a203-aadefcb88d98","Type":"ContainerStarted","Data":"9e8f43d2c5f7682b3d4e40dc4d48d1cf5b8a91137ef984b8c78d9f25e350a438"} Dec 05 00:27:35 crc kubenswrapper[4759]: I1205 00:27:35.216722 4759 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef44e93e-b269-459c-b2ae-22a70267bc87" Dec 05 00:27:35 crc kubenswrapper[4759]: I1205 00:27:35.218716 4759 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef44e93e-b269-459c-b2ae-22a70267bc87" Dec 05 00:27:35 crc kubenswrapper[4759]: I1205 00:27:35.219335 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:35 crc kubenswrapper[4759]: 
I1205 00:27:35.221412 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:35 crc kubenswrapper[4759]: I1205 00:27:35.221686 4759 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="37c47aee-0522-4f6e-93ac-108c78b2f14a" Dec 05 00:27:35 crc kubenswrapper[4759]: I1205 00:27:35.503117 4759 patch_prober.go:28] interesting pod/oauth-openshift-74bc74d8d6-fv2r2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.57:6443/healthz\": read tcp 10.217.0.2:39232->10.217.0.57:6443: read: connection reset by peer" start-of-body= Dec 05 00:27:35 crc kubenswrapper[4759]: I1205 00:27:35.503173 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podUID="d3054fff-52bd-437c-a203-aadefcb88d98" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.57:6443/healthz\": read tcp 10.217.0.2:39232->10.217.0.57:6443: read: connection reset by peer" Dec 05 00:27:35 crc kubenswrapper[4759]: I1205 00:27:35.905856 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:27:35 crc kubenswrapper[4759]: I1205 00:27:35.910742 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:27:36 crc kubenswrapper[4759]: I1205 00:27:36.223121 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2_d3054fff-52bd-437c-a203-aadefcb88d98/oauth-openshift/0.log" Dec 05 00:27:36 crc kubenswrapper[4759]: I1205 00:27:36.223186 4759 generic.go:334] "Generic (PLEG): container finished" podID="d3054fff-52bd-437c-a203-aadefcb88d98" containerID="ab1b077a4418b6b32d62c6be6fcef7092d22e7d6231e8a54f4968dd8a9b52471" exitCode=255 Dec 05 00:27:36 crc kubenswrapper[4759]: I1205 00:27:36.223322 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" event={"ID":"d3054fff-52bd-437c-a203-aadefcb88d98","Type":"ContainerDied","Data":"ab1b077a4418b6b32d62c6be6fcef7092d22e7d6231e8a54f4968dd8a9b52471"} Dec 05 00:27:36 crc kubenswrapper[4759]: I1205 00:27:36.223658 4759 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef44e93e-b269-459c-b2ae-22a70267bc87" Dec 05 00:27:36 crc kubenswrapper[4759]: I1205 00:27:36.223686 4759 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ef44e93e-b269-459c-b2ae-22a70267bc87" Dec 05 00:27:36 crc kubenswrapper[4759]: I1205 00:27:36.223758 4759 scope.go:117] "RemoveContainer" containerID="ab1b077a4418b6b32d62c6be6fcef7092d22e7d6231e8a54f4968dd8a9b52471" Dec 05 00:27:36 crc kubenswrapper[4759]: I1205 00:27:36.228811 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 00:27:36 crc kubenswrapper[4759]: I1205 00:27:36.272603 4759 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
podUID="37c47aee-0522-4f6e-93ac-108c78b2f14a" Dec 05 00:27:37 crc kubenswrapper[4759]: I1205 00:27:37.230592 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2_d3054fff-52bd-437c-a203-aadefcb88d98/oauth-openshift/1.log" Dec 05 00:27:37 crc kubenswrapper[4759]: I1205 00:27:37.231375 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2_d3054fff-52bd-437c-a203-aadefcb88d98/oauth-openshift/0.log" Dec 05 00:27:37 crc kubenswrapper[4759]: I1205 00:27:37.231422 4759 generic.go:334] "Generic (PLEG): container finished" podID="d3054fff-52bd-437c-a203-aadefcb88d98" containerID="3f0fbdd1648428da5d982e5079c12378a3a1a54a2ea059cc94429c1f1691bf51" exitCode=255 Dec 05 00:27:37 crc kubenswrapper[4759]: I1205 00:27:37.231535 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" event={"ID":"d3054fff-52bd-437c-a203-aadefcb88d98","Type":"ContainerDied","Data":"3f0fbdd1648428da5d982e5079c12378a3a1a54a2ea059cc94429c1f1691bf51"} Dec 05 00:27:37 crc kubenswrapper[4759]: I1205 00:27:37.231573 4759 scope.go:117] "RemoveContainer" containerID="ab1b077a4418b6b32d62c6be6fcef7092d22e7d6231e8a54f4968dd8a9b52471" Dec 05 00:27:37 crc kubenswrapper[4759]: I1205 00:27:37.231936 4759 scope.go:117] "RemoveContainer" containerID="3f0fbdd1648428da5d982e5079c12378a3a1a54a2ea059cc94429c1f1691bf51" Dec 05 00:27:37 crc kubenswrapper[4759]: E1205 00:27:37.232145 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication(d3054fff-52bd-437c-a203-aadefcb88d98)\"" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podUID="d3054fff-52bd-437c-a203-aadefcb88d98" Dec 05 00:27:38 crc kubenswrapper[4759]: I1205 00:27:38.242037 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2_d3054fff-52bd-437c-a203-aadefcb88d98/oauth-openshift/1.log" Dec 05 00:27:38 crc kubenswrapper[4759]: I1205 00:27:38.242525 4759 scope.go:117] "RemoveContainer" containerID="3f0fbdd1648428da5d982e5079c12378a3a1a54a2ea059cc94429c1f1691bf51" Dec 05 00:27:38 crc kubenswrapper[4759]: E1205 00:27:38.242702 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication(d3054fff-52bd-437c-a203-aadefcb88d98)\"" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podUID="d3054fff-52bd-437c-a203-aadefcb88d98" Dec 05 00:27:41 crc kubenswrapper[4759]: I1205 00:27:41.261278 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:41 crc kubenswrapper[4759]: I1205 00:27:41.261656 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:41 crc kubenswrapper[4759]: I1205 00:27:41.262643 4759 scope.go:117] "RemoveContainer" containerID="3f0fbdd1648428da5d982e5079c12378a3a1a54a2ea059cc94429c1f1691bf51" Dec 05 00:27:41 crc kubenswrapper[4759]: E1205 00:27:41.263029 4759 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication(d3054fff-52bd-437c-a203-aadefcb88d98)\"" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podUID="d3054fff-52bd-437c-a203-aadefcb88d98" Dec 05 00:27:43 crc kubenswrapper[4759]: I1205 00:27:43.361374 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 00:27:43 crc kubenswrapper[4759]: I1205 00:27:43.421193 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 00:27:43 crc kubenswrapper[4759]: I1205 00:27:43.784661 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 00:27:43 crc kubenswrapper[4759]: I1205 00:27:43.812379 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 00:27:44 crc kubenswrapper[4759]: I1205 00:27:44.275718 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 00:27:44 crc kubenswrapper[4759]: I1205 00:27:44.580554 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 00:27:44 crc kubenswrapper[4759]: I1205 00:27:44.820176 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 00:27:45 crc kubenswrapper[4759]: I1205 00:27:45.158097 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 00:27:45 crc kubenswrapper[4759]: I1205 00:27:45.229274 4759 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 00:27:45 crc kubenswrapper[4759]: I1205 00:27:45.300958 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 00:27:45 crc kubenswrapper[4759]: I1205 00:27:45.409886 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 00:27:45 crc kubenswrapper[4759]: I1205 00:27:45.411448 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 00:27:45 crc kubenswrapper[4759]: I1205 00:27:45.493826 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 00:27:45 crc kubenswrapper[4759]: I1205 00:27:45.670965 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 00:27:45 crc kubenswrapper[4759]: I1205 00:27:45.747058 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 00:27:45 crc kubenswrapper[4759]: I1205 00:27:45.941432 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 00:27:45 crc kubenswrapper[4759]: I1205 00:27:45.946490 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 00:27:45 crc kubenswrapper[4759]: I1205 00:27:45.988295 4759 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 00:27:46 crc kubenswrapper[4759]: I1205 00:27:46.108911 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 00:27:46 crc kubenswrapper[4759]: I1205 00:27:46.167158 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 00:27:46 crc kubenswrapper[4759]: I1205 00:27:46.202287 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 00:27:46 crc kubenswrapper[4759]: I1205 00:27:46.422367 4759 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 00:27:46 crc kubenswrapper[4759]: I1205 00:27:46.518110 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 00:27:46 crc kubenswrapper[4759]: I1205 00:27:46.569490 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 00:27:46 crc kubenswrapper[4759]: I1205 00:27:46.617585 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 00:27:46 crc kubenswrapper[4759]: I1205 00:27:46.638996 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 00:27:46 crc kubenswrapper[4759]: I1205 00:27:46.796419 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 00:27:46 crc kubenswrapper[4759]: I1205 00:27:46.928683 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.158179 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.158488 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.162564 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.236153 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.239738 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.267984 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.273225 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.364559 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.390234 4759 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.451860 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.475960 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.484563 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.748439 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.793746 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.836755 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.837735 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.865048 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 00:27:47 crc kubenswrapper[4759]: I1205 00:27:47.936763 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.037865 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.130400 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.135144 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.140406 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.141274 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.150120 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.150541 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.164576 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.310463 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 
00:27:48.395994 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.435911 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.461511 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.503259 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.507168 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.527443 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.551964 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.713173 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.726908 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.867568 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.875497 4759 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.880166 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.880240 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.880264 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2"] Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.880929 4759 scope.go:117] "RemoveContainer" containerID="3f0fbdd1648428da5d982e5079c12378a3a1a54a2ea059cc94429c1f1691bf51" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.890408 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.899611 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.899594409 podStartE2EDuration="14.899594409s" podCreationTimestamp="2025-12-05 00:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:27:48.898209832 +0000 UTC m=+288.113870782" watchObservedRunningTime="2025-12-05 00:27:48.899594409 +0000 UTC m=+288.115255369" Dec 05 00:27:48 crc kubenswrapper[4759]: I1205 00:27:48.964809 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" 
Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.009735 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.192035 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.224000 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.309229 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2_d3054fff-52bd-437c-a203-aadefcb88d98/oauth-openshift/1.log" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.309567 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" event={"ID":"d3054fff-52bd-437c-a203-aadefcb88d98","Type":"ContainerStarted","Data":"6c78d96d5b5560138c843ee5c56e10626c20f198875e9a103008e9a2a6771cff"} Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.310675 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.311764 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.335084 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podStartSLOduration=66.33506924 podStartE2EDuration="1m6.33506924s" podCreationTimestamp="2025-12-05 00:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:27:35.246974047 +0000 UTC m=+274.462635017" watchObservedRunningTime="2025-12-05 00:27:49.33506924 +0000 UTC m=+288.550730190" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.451180 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.465185 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.510536 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.521778 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.643101 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.644648 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.672991 4759 patch_prober.go:28] interesting pod/oauth-openshift-74bc74d8d6-fv2r2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.57:6443/healthz\": read tcp 10.217.0.2:44188->10.217.0.57:6443: 
read: connection reset by peer" start-of-body= Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.673048 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podUID="d3054fff-52bd-437c-a203-aadefcb88d98" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.57:6443/healthz\": read tcp 10.217.0.2:44188->10.217.0.57:6443: read: connection reset by peer" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.710987 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.757650 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.757660 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.774636 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.818961 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.836281 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 00:27:49 crc kubenswrapper[4759]: I1205 00:27:49.914154 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.082293 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.279338 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.316265 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2_d3054fff-52bd-437c-a203-aadefcb88d98/oauth-openshift/2.log" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.316927 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2_d3054fff-52bd-437c-a203-aadefcb88d98/oauth-openshift/1.log" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.317091 4759 generic.go:334] "Generic (PLEG): container finished" podID="d3054fff-52bd-437c-a203-aadefcb88d98" containerID="6c78d96d5b5560138c843ee5c56e10626c20f198875e9a103008e9a2a6771cff" exitCode=255 Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.317146 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" event={"ID":"d3054fff-52bd-437c-a203-aadefcb88d98","Type":"ContainerDied","Data":"6c78d96d5b5560138c843ee5c56e10626c20f198875e9a103008e9a2a6771cff"} Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.317293 4759 scope.go:117] "RemoveContainer" containerID="3f0fbdd1648428da5d982e5079c12378a3a1a54a2ea059cc94429c1f1691bf51" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.317643 4759 scope.go:117] "RemoveContainer" 
containerID="6c78d96d5b5560138c843ee5c56e10626c20f198875e9a103008e9a2a6771cff" Dec 05 00:27:50 crc kubenswrapper[4759]: E1205 00:27:50.317939 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication(d3054fff-52bd-437c-a203-aadefcb88d98)\"" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podUID="d3054fff-52bd-437c-a203-aadefcb88d98" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.320761 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.433495 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.486664 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.498558 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.541453 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.552113 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.585864 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.783578 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.810246 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.919160 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 00:27:50 crc kubenswrapper[4759]: I1205 00:27:50.971393 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.006782 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.020463 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.020979 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.030627 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.067501 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 
00:27:51.082001 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.154126 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.165258 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.166390 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.261531 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.281085 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.329625 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2_d3054fff-52bd-437c-a203-aadefcb88d98/oauth-openshift/2.log" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.330460 4759 scope.go:117] "RemoveContainer" containerID="6c78d96d5b5560138c843ee5c56e10626c20f198875e9a103008e9a2a6771cff" Dec 05 00:27:51 crc kubenswrapper[4759]: E1205 00:27:51.330770 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication(d3054fff-52bd-437c-a203-aadefcb88d98)\"" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podUID="d3054fff-52bd-437c-a203-aadefcb88d98" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.473177 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.540181 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.595784 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.686719 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.692912 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.697264 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.742924 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.780489 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.794187 
4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.967239 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 00:27:51 crc kubenswrapper[4759]: I1205 00:27:51.986074 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.086506 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.094279 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.151297 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.187245 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.250574 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.274065 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.336669 4759 scope.go:117] "RemoveContainer" containerID="6c78d96d5b5560138c843ee5c56e10626c20f198875e9a103008e9a2a6771cff" Dec 05 00:27:52 crc kubenswrapper[4759]: E1205 00:27:52.337166 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication(d3054fff-52bd-437c-a203-aadefcb88d98)\"" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podUID="d3054fff-52bd-437c-a203-aadefcb88d98" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.373231 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.407748 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.410883 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.434767 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.688060 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.718909 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.719229 4759 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.762273 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.767920 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.807746 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.821980 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.839993 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.852098 4759 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.897333 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 00:27:52 crc kubenswrapper[4759]: I1205 00:27:52.937691 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.067066 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.128966 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.193486 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.202681 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.284354 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.370093 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.382437 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.410882 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.542864 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.567197 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 
00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.633843 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.649768 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.664381 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.683760 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.688126 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.841287 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.907044 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.911002 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.918633 4759 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 00:27:53 crc kubenswrapper[4759]: I1205 00:27:53.975339 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.014123 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.035827 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.067501 4759 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.137815 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.234792 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.308065 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.331268 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.376137 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.419089 4759 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.485378 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.501905 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.532242 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.562277 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.695267 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.696590 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.736693 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.935881 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.936204 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.956633 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 00:27:54 crc kubenswrapper[4759]: I1205 00:27:54.995021 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.006645 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.007807 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.132023 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.148272 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.225609 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.249820 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.356178 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.410129 4759 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.516366 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.518248 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.549483 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.550043 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.552711 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.648975 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.679562 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.694250 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.777505 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.781326 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.852657 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.880367 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.943221 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.964789 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 00:27:55 crc kubenswrapper[4759]: I1205 00:27:55.989991 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.041194 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.055231 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.117552 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.129428 4759 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.172085 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.404996 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.509957 4759 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.510355 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906" gracePeriod=5 Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.520231 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.589071 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.684596 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.726369 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.750710 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.806988 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.944433 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.985362 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 00:27:56 crc kubenswrapper[4759]: I1205 00:27:56.993498 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.051770 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.077759 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.133407 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.141474 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 
00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.431739 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.459926 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.503502 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.572268 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.595474 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.618075 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.733606 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.736529 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 00:27:57 crc kubenswrapper[4759]: I1205 00:27:57.883092 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.244604 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.407787 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.446987 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.451818 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.459647 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.556571 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.557778 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.569876 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.592584 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.812780 4759 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.849814 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 00:27:58 crc kubenswrapper[4759]: I1205 00:27:58.864568 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 00:27:59 crc kubenswrapper[4759]: I1205 00:27:59.016501 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 00:27:59 crc kubenswrapper[4759]: I1205 00:27:59.491119 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 00:27:59 crc kubenswrapper[4759]: I1205 00:27:59.760545 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 00:28:00 crc kubenswrapper[4759]: I1205 00:28:00.068344 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 00:28:01 crc kubenswrapper[4759]: I1205 00:28:01.472637 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.095129 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.095225 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.230758 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.230816 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.230858 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.230972 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.231004 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.231002 4759 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.231057 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.231087 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.231239 4759 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.231227 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.231384 4759 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.231422 4759 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.240971 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.334663 4759 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.334726 4759 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.391926 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.392333 4759 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906" exitCode=137 Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.392381 4759 scope.go:117] "RemoveContainer" containerID="aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.392415 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.408694 4759 scope.go:117] "RemoveContainer" containerID="aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906" Dec 05 00:28:02 crc kubenswrapper[4759]: E1205 00:28:02.409728 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906\": container with ID starting with aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906 not found: ID does not exist" containerID="aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906" Dec 05 00:28:02 crc kubenswrapper[4759]: I1205 00:28:02.409760 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906"} err="failed to get container status \"aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906\": rpc error: code = NotFound desc = could not find container \"aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906\": container with ID starting with aeca7a699adf0a25903a3b26562730c74bb741b13d95be6c5cf78fdaa209c906 not found: ID does not exist" Dec 05 00:28:03 crc kubenswrapper[4759]: I1205 00:28:03.156091 4759 scope.go:117] "RemoveContainer" containerID="6c78d96d5b5560138c843ee5c56e10626c20f198875e9a103008e9a2a6771cff" Dec 05 00:28:03 crc kubenswrapper[4759]: E1205 00:28:03.156298 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-74bc74d8d6-fv2r2_openshift-authentication(d3054fff-52bd-437c-a203-aadefcb88d98)\"" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" podUID="d3054fff-52bd-437c-a203-aadefcb88d98" Dec 05 00:28:03 crc kubenswrapper[4759]: I1205 00:28:03.164351 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 05 00:28:16 crc kubenswrapper[4759]: I1205 00:28:16.140583 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 00:28:17 crc kubenswrapper[4759]: I1205 00:28:17.155879 4759 scope.go:117] "RemoveContainer" containerID="6c78d96d5b5560138c843ee5c56e10626c20f198875e9a103008e9a2a6771cff" Dec 05 00:28:17 crc kubenswrapper[4759]: I1205 00:28:17.492389 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2_d3054fff-52bd-437c-a203-aadefcb88d98/oauth-openshift/2.log" Dec 05 00:28:17 crc kubenswrapper[4759]: I1205 00:28:17.759445 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 00:28:18 crc kubenswrapper[4759]: I1205 00:28:18.501670 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-74bc74d8d6-fv2r2_d3054fff-52bd-437c-a203-aadefcb88d98/oauth-openshift/2.log" Dec 05 00:28:18 crc kubenswrapper[4759]: I1205 00:28:18.502178 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" event={"ID":"d3054fff-52bd-437c-a203-aadefcb88d98","Type":"ContainerStarted","Data":"8f6f0aff0ce54e9dc6d9b11e0f7a447da3158abbb296f47685efe333d3f934e5"} Dec 05 00:28:18 crc kubenswrapper[4759]: I1205 00:28:18.502651 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:28:18 crc kubenswrapper[4759]: I1205 00:28:18.509221 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-74bc74d8d6-fv2r2" Dec 05 00:28:21 crc kubenswrapper[4759]: I1205 00:28:21.249550 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 00:28:22 crc kubenswrapper[4759]: I1205 00:28:22.527821 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 00:28:24 crc kubenswrapper[4759]: I1205 00:28:24.970705 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7hqjc"] Dec 05 00:28:24 crc kubenswrapper[4759]: I1205 00:28:24.971219 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" podUID="0ae425cf-9dc3-471e-a2b8-506eedb29c8d" containerName="controller-manager" containerID="cri-o://d3d08e9107983403d042f6b2fd6c4b1a5eccfa75dc681dd37045013df5abce17" gracePeriod=30 Dec 05 00:28:25 crc kubenswrapper[4759]: I1205 00:28:25.070077 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s"] Dec 05 00:28:25 crc kubenswrapper[4759]: I1205 00:28:25.070571 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" podUID="0d64b1f1-f632-4956-a9d8-83703ef96ca1" containerName="route-controller-manager" containerID="cri-o://5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7" gracePeriod=30 Dec 05 00:28:25 crc kubenswrapper[4759]: E1205 00:28:25.074695 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae425cf_9dc3_471e_a2b8_506eedb29c8d.slice/crio-d3d08e9107983403d042f6b2fd6c4b1a5eccfa75dc681dd37045013df5abce17.scope\": RecentStats: unable to find data in memory cache]" Dec 05 00:28:26 crc kubenswrapper[4759]: I1205 00:28:26.561009 4759 generic.go:334] "Generic (PLEG): container finished" podID="0ae425cf-9dc3-471e-a2b8-506eedb29c8d" containerID="d3d08e9107983403d042f6b2fd6c4b1a5eccfa75dc681dd37045013df5abce17" exitCode=0 Dec 05 00:28:26 crc kubenswrapper[4759]: I1205 00:28:26.561072 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" event={"ID":"0ae425cf-9dc3-471e-a2b8-506eedb29c8d","Type":"ContainerDied","Data":"d3d08e9107983403d042f6b2fd6c4b1a5eccfa75dc681dd37045013df5abce17"} Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.283751 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.289916 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.349088 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-986d5957d-jgvm6"] Dec 05 00:28:27 crc kubenswrapper[4759]: E1205 00:28:27.349286 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d64b1f1-f632-4956-a9d8-83703ef96ca1" containerName="route-controller-manager" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.349297 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d64b1f1-f632-4956-a9d8-83703ef96ca1" containerName="route-controller-manager" Dec 05 00:28:27 crc kubenswrapper[4759]: E1205 00:28:27.349328 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.349339 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 00:28:27 crc kubenswrapper[4759]: E1205 00:28:27.349346 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae425cf-9dc3-471e-a2b8-506eedb29c8d" containerName="controller-manager" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.349352 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae425cf-9dc3-471e-a2b8-506eedb29c8d" containerName="controller-manager" Dec 05 00:28:27 crc kubenswrapper[4759]: E1205 00:28:27.349371 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" containerName="installer" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.349377 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" containerName="installer" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.349457 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.349466 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae425cf-9dc3-471e-a2b8-506eedb29c8d" containerName="controller-manager" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.349476 4759 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0d64b1f1-f632-4956-a9d8-83703ef96ca1" containerName="route-controller-manager" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.349484 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbabb1ea-7574-4d58-9a61-0982f4d1897a" containerName="installer" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.349803 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.372606 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-config\") pod \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.372663 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-serving-cert\") pod \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.372916 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-config\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.372958 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-client-ca\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.372996 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-proxy-ca-bundles\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.373295 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c924045b-eac9-49b6-ab25-c256dd04fa74-serving-cert\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.373594 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbz9p\" (UniqueName: \"kubernetes.io/projected/c924045b-eac9-49b6-ab25-c256dd04fa74-kube-api-access-qbz9p\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.375584 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-config" (OuterVolumeSpecName: "config") pod "0d64b1f1-f632-4956-a9d8-83703ef96ca1" (UID: "0d64b1f1-f632-4956-a9d8-83703ef96ca1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.386184 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0ae425cf-9dc3-471e-a2b8-506eedb29c8d" (UID: "0ae425cf-9dc3-471e-a2b8-506eedb29c8d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.386988 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-986d5957d-jgvm6"] Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.475476 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76k4t\" (UniqueName: \"kubernetes.io/projected/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-kube-api-access-76k4t\") pod \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.475937 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-config\") pod \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.476764 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-config" (OuterVolumeSpecName: "config") pod "0ae425cf-9dc3-471e-a2b8-506eedb29c8d" (UID: "0ae425cf-9dc3-471e-a2b8-506eedb29c8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.476991 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-client-ca\") pod \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.477033 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-proxy-ca-bundles\") pod \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.477573 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0ae425cf-9dc3-471e-a2b8-506eedb29c8d" (UID: "0ae425cf-9dc3-471e-a2b8-506eedb29c8d"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.477753 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-client-ca\") pod \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\" (UID: \"0ae425cf-9dc3-471e-a2b8-506eedb29c8d\") " Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.478167 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-client-ca" (OuterVolumeSpecName: "client-ca") pod "0ae425cf-9dc3-471e-a2b8-506eedb29c8d" (UID: "0ae425cf-9dc3-471e-a2b8-506eedb29c8d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.478150 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-client-ca" (OuterVolumeSpecName: "client-ca") pod "0d64b1f1-f632-4956-a9d8-83703ef96ca1" (UID: "0d64b1f1-f632-4956-a9d8-83703ef96ca1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.477790 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d64b1f1-f632-4956-a9d8-83703ef96ca1-serving-cert\") pod \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.478610 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5bgw\" (UniqueName: \"kubernetes.io/projected/0d64b1f1-f632-4956-a9d8-83703ef96ca1-kube-api-access-l5bgw\") pod \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\" (UID: \"0d64b1f1-f632-4956-a9d8-83703ef96ca1\") " Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.479015 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c924045b-eac9-49b6-ab25-c256dd04fa74-serving-cert\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.479088 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbz9p\" (UniqueName: \"kubernetes.io/projected/c924045b-eac9-49b6-ab25-c256dd04fa74-kube-api-access-qbz9p\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.479884 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-config\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.479952 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-client-ca\") pod \"controller-manager-986d5957d-jgvm6\" (UID: 
\"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.480021 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-proxy-ca-bundles\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.480539 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.480565 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.480577 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.480768 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-kube-api-access-76k4t" (OuterVolumeSpecName: "kube-api-access-76k4t") pod "0ae425cf-9dc3-471e-a2b8-506eedb29c8d" (UID: "0ae425cf-9dc3-471e-a2b8-506eedb29c8d"). InnerVolumeSpecName "kube-api-access-76k4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.481193 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-client-ca\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.481623 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-config\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.481839 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d64b1f1-f632-4956-a9d8-83703ef96ca1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0d64b1f1-f632-4956-a9d8-83703ef96ca1" (UID: "0d64b1f1-f632-4956-a9d8-83703ef96ca1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.481996 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-proxy-ca-bundles\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.482057 4759 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d64b1f1-f632-4956-a9d8-83703ef96ca1-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.482070 4759 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.482082 4759 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.482601 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d64b1f1-f632-4956-a9d8-83703ef96ca1-kube-api-access-l5bgw" (OuterVolumeSpecName: "kube-api-access-l5bgw") pod "0d64b1f1-f632-4956-a9d8-83703ef96ca1" (UID: "0d64b1f1-f632-4956-a9d8-83703ef96ca1"). InnerVolumeSpecName "kube-api-access-l5bgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.485171 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c924045b-eac9-49b6-ab25-c256dd04fa74-serving-cert\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.498258 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbz9p\" (UniqueName: \"kubernetes.io/projected/c924045b-eac9-49b6-ab25-c256dd04fa74-kube-api-access-qbz9p\") pod \"controller-manager-986d5957d-jgvm6\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.569088 4759 generic.go:334] "Generic (PLEG): container finished" podID="0d64b1f1-f632-4956-a9d8-83703ef96ca1" containerID="5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7" exitCode=0 Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.569170 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" event={"ID":"0d64b1f1-f632-4956-a9d8-83703ef96ca1","Type":"ContainerDied","Data":"5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7"} Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.569201 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s" event={"ID":"0d64b1f1-f632-4956-a9d8-83703ef96ca1","Type":"ContainerDied","Data":"ad8e38cbfbac8dd01bd7223f364bc86e030c7c0ddada8b67eec0421e28b3468c"} Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 
00:28:27.569221 4759 scope.go:117] "RemoveContainer" containerID="5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7"
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.569238 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s"
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.572562 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc" event={"ID":"0ae425cf-9dc3-471e-a2b8-506eedb29c8d","Type":"ContainerDied","Data":"837c4eb1c40cd901e186850f9c548a4e6d158062ef39fb804b6185535a1888c9"}
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.572705 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7hqjc"
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.582926 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d64b1f1-f632-4956-a9d8-83703ef96ca1-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.582974 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5bgw\" (UniqueName: \"kubernetes.io/projected/0d64b1f1-f632-4956-a9d8-83703ef96ca1-kube-api-access-l5bgw\") on node \"crc\" DevicePath \"\""
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.582986 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76k4t\" (UniqueName: \"kubernetes.io/projected/0ae425cf-9dc3-471e-a2b8-506eedb29c8d-kube-api-access-76k4t\") on node \"crc\" DevicePath \"\""
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.588790 4759 scope.go:117] "RemoveContainer" containerID="5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7"
Dec 05 00:28:27 crc kubenswrapper[4759]: E1205 00:28:27.589294 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7\": container with ID starting with 5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7 not found: ID does not exist" containerID="5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7"
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.589366 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7"} err="failed to get container status \"5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7\": rpc error: code = NotFound desc = could not find container \"5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7\": container with ID starting with 5d1324dc64a6a628ade5ddbf5a73a5a850ea18f582064cfe2be048086ca367a7 not found: ID does not exist"
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.589399 4759 scope.go:117] "RemoveContainer" containerID="d3d08e9107983403d042f6b2fd6c4b1a5eccfa75dc681dd37045013df5abce17"
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.602439 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s"]
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.602898 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-p6l5s"]
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.642535 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7hqjc"]
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.645948 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7hqjc"]
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.667708 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6"
Dec 05 00:28:27 crc kubenswrapper[4759]: I1205 00:28:27.857196 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-986d5957d-jgvm6"]
Dec 05 00:28:28 crc kubenswrapper[4759]: I1205 00:28:28.581254 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" event={"ID":"c924045b-eac9-49b6-ab25-c256dd04fa74","Type":"ContainerStarted","Data":"3cf22669fe6f774c404eaaae5c8853d884b4cf32e702bb8cb7339bdd1ad3c1ac"}
Dec 05 00:28:28 crc kubenswrapper[4759]: I1205 00:28:28.581664 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" event={"ID":"c924045b-eac9-49b6-ab25-c256dd04fa74","Type":"ContainerStarted","Data":"24fe584c425caeff91db016bb8422b4493e88169e2983fc8430951e5c7f422db"}
Dec 05 00:28:28 crc kubenswrapper[4759]: I1205 00:28:28.581690 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6"
Dec 05 00:28:28 crc kubenswrapper[4759]: I1205 00:28:28.586614 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6"
Dec 05 00:28:28 crc kubenswrapper[4759]: I1205 00:28:28.603608 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" podStartSLOduration=3.60359074 podStartE2EDuration="3.60359074s" podCreationTimestamp="2025-12-05 00:28:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:28:28.600147043 +0000 UTC m=+327.815807993" watchObservedRunningTime="2025-12-05 00:28:28.60359074 +0000 UTC m=+327.819251690"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.161962 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae425cf-9dc3-471e-a2b8-506eedb29c8d" path="/var/lib/kubelet/pods/0ae425cf-9dc3-471e-a2b8-506eedb29c8d/volumes"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.162519 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d64b1f1-f632-4956-a9d8-83703ef96ca1" path="/var/lib/kubelet/pods/0d64b1f1-f632-4956-a9d8-83703ef96ca1/volumes"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.371024 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"]
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.371786 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.374146 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.374183 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.374196 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.374598 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.375143 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.381491 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.390196 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"]
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.408017 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrqr2\" (UniqueName: \"kubernetes.io/projected/cba371bf-29fe-493f-b0a8-1616619e0f08-kube-api-access-lrqr2\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.408117 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-config\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.408361 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba371bf-29fe-493f-b0a8-1616619e0f08-serving-cert\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.408596 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-client-ca\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.509859 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-client-ca\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.509917 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrqr2\" (UniqueName: \"kubernetes.io/projected/cba371bf-29fe-493f-b0a8-1616619e0f08-kube-api-access-lrqr2\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.509963 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-config\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.509985 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba371bf-29fe-493f-b0a8-1616619e0f08-serving-cert\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.511261 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-client-ca\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.511280 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-config\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.516993 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba371bf-29fe-493f-b0a8-1616619e0f08-serving-cert\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.532598 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrqr2\" (UniqueName: \"kubernetes.io/projected/cba371bf-29fe-493f-b0a8-1616619e0f08-kube-api-access-lrqr2\") pod \"route-controller-manager-599c9649f6-57bkq\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:29 crc kubenswrapper[4759]: I1205 00:28:29.686769 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:30 crc kubenswrapper[4759]: I1205 00:28:30.209160 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"]
Dec 05 00:28:30 crc kubenswrapper[4759]: I1205 00:28:30.592805 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq" event={"ID":"cba371bf-29fe-493f-b0a8-1616619e0f08","Type":"ContainerStarted","Data":"3b6d9c0d3f929aa392d719af98b5131ed220ec793fa9deb7120c97298b127514"}
Dec 05 00:28:31 crc kubenswrapper[4759]: I1205 00:28:31.599918 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq" event={"ID":"cba371bf-29fe-493f-b0a8-1616619e0f08","Type":"ContainerStarted","Data":"631a340cfc16741fd63583eb7e88ccd6a8c79fb9b77de7331a802936caff95b1"}
Dec 05 00:28:32 crc kubenswrapper[4759]: I1205 00:28:32.605241 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:32 crc kubenswrapper[4759]: I1205 00:28:32.610858 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"
Dec 05 00:28:32 crc kubenswrapper[4759]: I1205 00:28:32.637939 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq" podStartSLOduration=7.6379158700000005 podStartE2EDuration="7.63791587s" podCreationTimestamp="2025-12-05 00:28:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:28:32.623580472 +0000 UTC m=+331.839241432" watchObservedRunningTime="2025-12-05 00:28:32.63791587 +0000 UTC m=+331.853576820"
Dec 05 00:28:34 crc kubenswrapper[4759]: I1205 00:28:34.433292 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 00:28:34 crc kubenswrapper[4759]: I1205 00:28:34.433726 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 00:28:34 crc kubenswrapper[4759]: I1205 00:28:34.959138 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 05 00:29:02 crc kubenswrapper[4759]: I1205 00:29:02.291633 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7fvzn"]
Dec 05 00:29:02 crc kubenswrapper[4759]: I1205 00:29:02.292512 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7fvzn" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerName="registry-server" containerID="cri-o://a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a" gracePeriod=2
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.001099 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7fvzn"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.096014 4759 generic.go:334] "Generic (PLEG): container finished" podID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerID="a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a" exitCode=0
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.096064 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fvzn" event={"ID":"52b6eecd-9d85-46ac-9163-b04da27c2a2c","Type":"ContainerDied","Data":"a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a"}
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.096093 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7fvzn" event={"ID":"52b6eecd-9d85-46ac-9163-b04da27c2a2c","Type":"ContainerDied","Data":"59210c6354afa7ca9987901e0d333d8e7a66b76c7b53d02820e57bd0549ebc71"}
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.096085 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7fvzn"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.096112 4759 scope.go:117] "RemoveContainer" containerID="a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.118240 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-catalog-content\") pod \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") "
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.119371 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-utilities\") pod \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") "
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.119521 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2w4d\" (UniqueName: \"kubernetes.io/projected/52b6eecd-9d85-46ac-9163-b04da27c2a2c-kube-api-access-m2w4d\") pod \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\" (UID: \"52b6eecd-9d85-46ac-9163-b04da27c2a2c\") "
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.120236 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-utilities" (OuterVolumeSpecName: "utilities") pod "52b6eecd-9d85-46ac-9163-b04da27c2a2c" (UID: "52b6eecd-9d85-46ac-9163-b04da27c2a2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.120830 4759 scope.go:117] "RemoveContainer" containerID="8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.127184 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b6eecd-9d85-46ac-9163-b04da27c2a2c-kube-api-access-m2w4d" (OuterVolumeSpecName: "kube-api-access-m2w4d") pod "52b6eecd-9d85-46ac-9163-b04da27c2a2c" (UID: "52b6eecd-9d85-46ac-9163-b04da27c2a2c"). InnerVolumeSpecName "kube-api-access-m2w4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.168193 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52b6eecd-9d85-46ac-9163-b04da27c2a2c" (UID: "52b6eecd-9d85-46ac-9163-b04da27c2a2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.168510 4759 scope.go:117] "RemoveContainer" containerID="f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.190656 4759 scope.go:117] "RemoveContainer" containerID="a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a"
Dec 05 00:29:04 crc kubenswrapper[4759]: E1205 00:29:04.191292 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a\": container with ID starting with a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a not found: ID does not exist" containerID="a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.191405 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a"} err="failed to get container status \"a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a\": rpc error: code = NotFound desc = could not find container \"a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a\": container with ID starting with a28013070607e41c4f0c8e6eb79b6429c9c1bb5d56b20540b73086f2206afc1a not found: ID does not exist"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.191444 4759 scope.go:117] "RemoveContainer" containerID="8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223"
Dec 05 00:29:04 crc kubenswrapper[4759]: E1205 00:29:04.191804 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223\": container with ID starting with 8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223 not found: ID does not exist" containerID="8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.191847 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223"} err="failed to get container status \"8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223\": rpc error: code = NotFound desc = could not find container \"8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223\": container with ID starting with 8aca36bfd75bdd9f093843d527de2a77af57d91d1a93aeaa0a764360b8f22223 not found: ID does not exist"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.191882 4759 scope.go:117] "RemoveContainer" containerID="f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f"
Dec 05 00:29:04 crc kubenswrapper[4759]: E1205 00:29:04.192192 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f\": container with ID starting with f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f not found: ID does not exist" containerID="f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.192241 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f"} err="failed to get container status \"f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f\": rpc error: code = NotFound desc = could not find container \"f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f\": container with ID starting with f47311d4b00f6a698d6ad16d0bd47d35aec42698c90d7ecbcd94899beb547a9f not found: ID does not exist"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.221435 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.221484 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b6eecd-9d85-46ac-9163-b04da27c2a2c-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.221500 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2w4d\" (UniqueName: \"kubernetes.io/projected/52b6eecd-9d85-46ac-9163-b04da27c2a2c-kube-api-access-m2w4d\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.433508 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.433565 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.436621 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7fvzn"]
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.443170 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7fvzn"]
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.691765 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9pqvs"]
Dec 05 00:29:04 crc kubenswrapper[4759]: I1205 00:29:04.692061 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9pqvs" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" containerName="registry-server" containerID="cri-o://744b2304556a356aa68ab914b5e2ec67b2747f6ef2587b67fe6f5050f364718a" gracePeriod=2
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.105653 4759 generic.go:334] "Generic (PLEG): container finished" podID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" containerID="744b2304556a356aa68ab914b5e2ec67b2747f6ef2587b67fe6f5050f364718a" exitCode=0
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.105769 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9pqvs" event={"ID":"628054fc-dfa7-402e-8bd0-d56eed57b9fe","Type":"ContainerDied","Data":"744b2304556a356aa68ab914b5e2ec67b2747f6ef2587b67fe6f5050f364718a"}
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.162367 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" path="/var/lib/kubelet/pods/52b6eecd-9d85-46ac-9163-b04da27c2a2c/volumes"
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.292964 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cmjn5"]
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.293246 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cmjn5" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerName="registry-server" containerID="cri-o://8eb710201706ebcda5527009cc00f674d34c3e06bd04a031b6bdadff1cbe0277" gracePeriod=2
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.866143 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9pqvs"
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.946967 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-utilities\") pod \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") "
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.947066 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mkk8\" (UniqueName: \"kubernetes.io/projected/628054fc-dfa7-402e-8bd0-d56eed57b9fe-kube-api-access-6mkk8\") pod \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") "
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.947124 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-catalog-content\") pod \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\" (UID: \"628054fc-dfa7-402e-8bd0-d56eed57b9fe\") "
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.952848 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-utilities" (OuterVolumeSpecName: "utilities") pod "628054fc-dfa7-402e-8bd0-d56eed57b9fe" (UID: "628054fc-dfa7-402e-8bd0-d56eed57b9fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.955620 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/628054fc-dfa7-402e-8bd0-d56eed57b9fe-kube-api-access-6mkk8" (OuterVolumeSpecName: "kube-api-access-6mkk8") pod "628054fc-dfa7-402e-8bd0-d56eed57b9fe" (UID: "628054fc-dfa7-402e-8bd0-d56eed57b9fe"). InnerVolumeSpecName "kube-api-access-6mkk8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:29:05 crc kubenswrapper[4759]: I1205 00:29:05.981681 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "628054fc-dfa7-402e-8bd0-d56eed57b9fe" (UID: "628054fc-dfa7-402e-8bd0-d56eed57b9fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:29:06 crc kubenswrapper[4759]: I1205 00:29:06.048911 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mkk8\" (UniqueName: \"kubernetes.io/projected/628054fc-dfa7-402e-8bd0-d56eed57b9fe-kube-api-access-6mkk8\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:06 crc kubenswrapper[4759]: I1205 00:29:06.048954 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:06 crc kubenswrapper[4759]: I1205 00:29:06.048967 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628054fc-dfa7-402e-8bd0-d56eed57b9fe-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:06 crc kubenswrapper[4759]: I1205 00:29:06.115096 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9pqvs" event={"ID":"628054fc-dfa7-402e-8bd0-d56eed57b9fe","Type":"ContainerDied","Data":"5e0c1dd225b0bfde680bb78b8181daae73a07235ab6a3eb3ab9506017a8ad278"}
Dec 05 00:29:06 crc kubenswrapper[4759]: I1205 00:29:06.115166 4759 scope.go:117] "RemoveContainer" containerID="744b2304556a356aa68ab914b5e2ec67b2747f6ef2587b67fe6f5050f364718a"
Dec 05 00:29:06 crc kubenswrapper[4759]: I1205 00:29:06.115197 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9pqvs"
Dec 05 00:29:06 crc kubenswrapper[4759]: I1205 00:29:06.131972 4759 scope.go:117] "RemoveContainer" containerID="862c7853b4effc5d135af4af71f2bcb2b199ce067d146a675396382ff975ba1b"
Dec 05 00:29:06 crc kubenswrapper[4759]: I1205 00:29:06.150023 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9pqvs"]
Dec 05 00:29:06 crc kubenswrapper[4759]: I1205 00:29:06.158385 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9pqvs"]
Dec 05 00:29:06 crc kubenswrapper[4759]: I1205 00:29:06.159820 4759 scope.go:117] "RemoveContainer" containerID="782b740eca4b92c2388ce63329b8db2c129b7c90d383955959e7af5372b7be43"
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.122716 4759 generic.go:334] "Generic (PLEG): container finished" podID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerID="8eb710201706ebcda5527009cc00f674d34c3e06bd04a031b6bdadff1cbe0277" exitCode=0
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.122783 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjn5" event={"ID":"1dc5ee97-3aec-41e6-be6e-c479b3038dd6","Type":"ContainerDied","Data":"8eb710201706ebcda5527009cc00f674d34c3e06bd04a031b6bdadff1cbe0277"}
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.163407 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" path="/var/lib/kubelet/pods/628054fc-dfa7-402e-8bd0-d56eed57b9fe/volumes"
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.583820 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cmjn5"
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.679767 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hgfw\" (UniqueName: \"kubernetes.io/projected/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-kube-api-access-9hgfw\") pod \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") "
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.680047 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-catalog-content\") pod \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") "
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.680119 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-utilities\") pod \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\" (UID: \"1dc5ee97-3aec-41e6-be6e-c479b3038dd6\") "
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.680985 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-utilities" (OuterVolumeSpecName: "utilities") pod "1dc5ee97-3aec-41e6-be6e-c479b3038dd6" (UID: "1dc5ee97-3aec-41e6-be6e-c479b3038dd6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.685495 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-kube-api-access-9hgfw" (OuterVolumeSpecName: "kube-api-access-9hgfw") pod "1dc5ee97-3aec-41e6-be6e-c479b3038dd6" (UID: "1dc5ee97-3aec-41e6-be6e-c479b3038dd6"). InnerVolumeSpecName "kube-api-access-9hgfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.781223 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.781259 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hgfw\" (UniqueName: \"kubernetes.io/projected/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-kube-api-access-9hgfw\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.797720 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1dc5ee97-3aec-41e6-be6e-c479b3038dd6" (UID: "1dc5ee97-3aec-41e6-be6e-c479b3038dd6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:29:07 crc kubenswrapper[4759]: I1205 00:29:07.882078 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dc5ee97-3aec-41e6-be6e-c479b3038dd6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:08 crc kubenswrapper[4759]: I1205 00:29:08.131011 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjn5" event={"ID":"1dc5ee97-3aec-41e6-be6e-c479b3038dd6","Type":"ContainerDied","Data":"4cd819871cdb413a1b25d865c2c97a14ff03280f14f35cc94019feb2cf48aa6f"}
Dec 05 00:29:08 crc kubenswrapper[4759]: I1205 00:29:08.131068 4759 scope.go:117] "RemoveContainer" containerID="8eb710201706ebcda5527009cc00f674d34c3e06bd04a031b6bdadff1cbe0277"
Dec 05 00:29:08 crc kubenswrapper[4759]: I1205 00:29:08.131099 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cmjn5"
Dec 05 00:29:08 crc kubenswrapper[4759]: I1205 00:29:08.173147 4759 scope.go:117] "RemoveContainer" containerID="a7681ed73a22689b5ac6ffecd793d6da2449f9ca89350c6e55ece69345378756"
Dec 05 00:29:08 crc kubenswrapper[4759]: I1205 00:29:08.178242 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cmjn5"]
Dec 05 00:29:08 crc kubenswrapper[4759]: I1205 00:29:08.185975 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cmjn5"]
Dec 05 00:29:08 crc kubenswrapper[4759]: I1205 00:29:08.208351 4759 scope.go:117] "RemoveContainer" containerID="b6a3ba4d0cbd3d065c943a377d75528ca33c9d257cfb2da4687bcbdb497154ff"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.163036 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" path="/var/lib/kubelet/pods/1dc5ee97-3aec-41e6-be6e-c479b3038dd6/volumes"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.519299 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lgc9q"]
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.519605 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lgc9q" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerName="registry-server" containerID="cri-o://87a1a8dba60dfff88e376723c7cfcaec94c55efd5890d53b7303c6b5cd4dfbad" gracePeriod=30
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.533058 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pb4c2"]
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.533364 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pb4c2" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerName="registry-server" containerID="cri-o://d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969" gracePeriod=30
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.538372 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vtnn"]
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.538633 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" podUID="a6019120-bf7b-47df-9e54-c7761066eb48" containerName="marketplace-operator" containerID="cri-o://5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770" gracePeriod=30
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.556326 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5s4sw"]
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.556668 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5s4sw" podUID="7c605086-ba35-4534-8732-246afbfde953" containerName="registry-server" containerID="cri-o://fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158" gracePeriod=30
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.560039 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n7tt2"]
Dec 05 00:29:09 crc kubenswrapper[4759]: E1205 00:29:09.560286 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerName="registry-server"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566524 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerName="registry-server"
Dec 05 00:29:09 crc kubenswrapper[4759]: E1205 00:29:09.566626 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" containerName="registry-server"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566642 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" containerName="registry-server"
Dec 05 00:29:09 crc kubenswrapper[4759]: E1205 00:29:09.566649 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerName="extract-utilities"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566658 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerName="extract-utilities"
Dec 05 00:29:09 crc kubenswrapper[4759]: E1205 00:29:09.566668 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerName="registry-server"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566674 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerName="registry-server"
Dec 05 00:29:09 crc kubenswrapper[4759]: E1205 00:29:09.566683 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerName="extract-utilities"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566689 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerName="extract-utilities"
Dec 05 00:29:09 crc kubenswrapper[4759]: E1205 00:29:09.566701 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerName="extract-content"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566706 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerName="extract-content"
Dec 05 00:29:09 crc kubenswrapper[4759]: E1205 00:29:09.566716 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerName="extract-content"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566723 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerName="extract-content"
Dec 05 00:29:09 crc kubenswrapper[4759]: E1205 00:29:09.566737 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" containerName="extract-content"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566744 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" containerName="extract-content"
Dec 05 00:29:09 crc kubenswrapper[4759]: E1205 00:29:09.566752 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" containerName="extract-utilities"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566758 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" containerName="extract-utilities"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566926 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5ee97-3aec-41e6-be6e-c479b3038dd6" containerName="registry-server"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566938 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="628054fc-dfa7-402e-8bd0-d56eed57b9fe" containerName="registry-server"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.566952 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b6eecd-9d85-46ac-9163-b04da27c2a2c" containerName="registry-server"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.567511 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.571406 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rtslc"]
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.571764 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rtslc" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerName="registry-server" containerID="cri-o://9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a" gracePeriod=30
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.582763 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n7tt2"]
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.704525 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423b5d0f-1418-420b-80ca-f05d0087c85e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n7tt2\" (UID: \"423b5d0f-1418-420b-80ca-f05d0087c85e\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.704825 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l6f9\" (UniqueName: \"kubernetes.io/projected/423b5d0f-1418-420b-80ca-f05d0087c85e-kube-api-access-7l6f9\") pod \"marketplace-operator-79b997595-n7tt2\" (UID: \"423b5d0f-1418-420b-80ca-f05d0087c85e\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.704876 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/423b5d0f-1418-420b-80ca-f05d0087c85e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n7tt2\" (UID: \"423b5d0f-1418-420b-80ca-f05d0087c85e\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.805694 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l6f9\" (UniqueName: \"kubernetes.io/projected/423b5d0f-1418-420b-80ca-f05d0087c85e-kube-api-access-7l6f9\") pod \"marketplace-operator-79b997595-n7tt2\" (UID: \"423b5d0f-1418-420b-80ca-f05d0087c85e\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.805778 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/423b5d0f-1418-420b-80ca-f05d0087c85e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n7tt2\" (UID: \"423b5d0f-1418-420b-80ca-f05d0087c85e\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.805839 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423b5d0f-1418-420b-80ca-f05d0087c85e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n7tt2\" (UID: \"423b5d0f-1418-420b-80ca-f05d0087c85e\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.807464 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423b5d0f-1418-420b-80ca-f05d0087c85e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n7tt2\" (UID: \"423b5d0f-1418-420b-80ca-f05d0087c85e\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.813799 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/423b5d0f-1418-420b-80ca-f05d0087c85e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n7tt2\" (UID: \"423b5d0f-1418-420b-80ca-f05d0087c85e\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.832478 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l6f9\" (UniqueName: \"kubernetes.io/projected/423b5d0f-1418-420b-80ca-f05d0087c85e-kube-api-access-7l6f9\") pod \"marketplace-operator-79b997595-n7tt2\" (UID: \"423b5d0f-1418-420b-80ca-f05d0087c85e\") " pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2"
Dec 05 00:29:09 crc kubenswrapper[4759]: I1205 00:29:09.887026 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.030030 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pb4c2"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.042614 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.077445 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5s4sw"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.083734 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rtslc"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.113541 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn7m4\" (UniqueName: \"kubernetes.io/projected/031753f7-0b97-45ec-8e24-a6aeafb09d65-kube-api-access-vn7m4\") pod \"031753f7-0b97-45ec-8e24-a6aeafb09d65\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.114718 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s6sm\" (UniqueName: \"kubernetes.io/projected/a6019120-bf7b-47df-9e54-c7761066eb48-kube-api-access-8s6sm\") pod \"a6019120-bf7b-47df-9e54-c7761066eb48\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.114827 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-trusted-ca\") pod \"a6019120-bf7b-47df-9e54-c7761066eb48\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.114855 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-operator-metrics\") pod \"a6019120-bf7b-47df-9e54-c7761066eb48\" (UID: \"a6019120-bf7b-47df-9e54-c7761066eb48\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.114883 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-utilities\") pod \"031753f7-0b97-45ec-8e24-a6aeafb09d65\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.114911 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-catalog-content\") pod \"031753f7-0b97-45ec-8e24-a6aeafb09d65\" (UID: \"031753f7-0b97-45ec-8e24-a6aeafb09d65\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.124342 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-utilities" (OuterVolumeSpecName: "utilities") pod "031753f7-0b97-45ec-8e24-a6aeafb09d65" (UID: "031753f7-0b97-45ec-8e24-a6aeafb09d65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.125571 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a6019120-bf7b-47df-9e54-c7761066eb48" (UID: "a6019120-bf7b-47df-9e54-c7761066eb48"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.125831 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031753f7-0b97-45ec-8e24-a6aeafb09d65-kube-api-access-vn7m4" (OuterVolumeSpecName: "kube-api-access-vn7m4") pod "031753f7-0b97-45ec-8e24-a6aeafb09d65" (UID: "031753f7-0b97-45ec-8e24-a6aeafb09d65"). InnerVolumeSpecName "kube-api-access-vn7m4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.126202 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6019120-bf7b-47df-9e54-c7761066eb48-kube-api-access-8s6sm" (OuterVolumeSpecName: "kube-api-access-8s6sm") pod "a6019120-bf7b-47df-9e54-c7761066eb48" (UID: "a6019120-bf7b-47df-9e54-c7761066eb48"). InnerVolumeSpecName "kube-api-access-8s6sm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.136155 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a6019120-bf7b-47df-9e54-c7761066eb48" (UID: "a6019120-bf7b-47df-9e54-c7761066eb48"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.150978 4759 generic.go:334] "Generic (PLEG): container finished" podID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerID="d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969" exitCode=0
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.151038 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb4c2" event={"ID":"031753f7-0b97-45ec-8e24-a6aeafb09d65","Type":"ContainerDied","Data":"d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969"}
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.151064 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb4c2" event={"ID":"031753f7-0b97-45ec-8e24-a6aeafb09d65","Type":"ContainerDied","Data":"a81117c2d0a0ecd8d41819212fe00552f64ebda0cc3937d72236714ea4ce3ed4"}
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.151081 4759 scope.go:117] "RemoveContainer" containerID="d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.151196 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pb4c2"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.167201 4759 generic.go:334] "Generic (PLEG): container finished" podID="7c605086-ba35-4534-8732-246afbfde953" containerID="fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158" exitCode=0
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.167274 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s4sw" event={"ID":"7c605086-ba35-4534-8732-246afbfde953","Type":"ContainerDied","Data":"fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158"}
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.167318 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5s4sw" event={"ID":"7c605086-ba35-4534-8732-246afbfde953","Type":"ContainerDied","Data":"cbbd617213e1f19021717a58d190fa7c509935774ca593395e241cc0565a1359"}
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.167389 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5s4sw"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.172370 4759 generic.go:334] "Generic (PLEG): container finished" podID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerID="87a1a8dba60dfff88e376723c7cfcaec94c55efd5890d53b7303c6b5cd4dfbad" exitCode=0
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.172465 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgc9q" event={"ID":"39167ad7-8b39-4c2b-b783-88427c69b7eb","Type":"ContainerDied","Data":"87a1a8dba60dfff88e376723c7cfcaec94c55efd5890d53b7303c6b5cd4dfbad"}
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.174128 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "031753f7-0b97-45ec-8e24-a6aeafb09d65" (UID: "031753f7-0b97-45ec-8e24-a6aeafb09d65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.174477 4759 generic.go:334] "Generic (PLEG): container finished" podID="a6019120-bf7b-47df-9e54-c7761066eb48" containerID="5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770" exitCode=0
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.174557 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" event={"ID":"a6019120-bf7b-47df-9e54-c7761066eb48","Type":"ContainerDied","Data":"5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770"}
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.174595 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn" event={"ID":"a6019120-bf7b-47df-9e54-c7761066eb48","Type":"ContainerDied","Data":"7c26b3bd442afedaced6769ce0f0908652e8c9b0eaa1151f60163aad8678f91a"}
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.174682 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vtnn"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.182830 4759 scope.go:117] "RemoveContainer" containerID="83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.185005 4759 generic.go:334] "Generic (PLEG): container finished" podID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerID="9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a" exitCode=0
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.185038 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtslc" event={"ID":"eac9e47c-1b1d-4b22-9040-3a198c5758fe","Type":"ContainerDied","Data":"9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a"}
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.185057 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtslc" event={"ID":"eac9e47c-1b1d-4b22-9040-3a198c5758fe","Type":"ContainerDied","Data":"d74885a36d2c936d7043a54f886923d6f5fa79570928e0f5bc1f69d9dfc60420"}
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.185109 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rtslc"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.201765 4759 scope.go:117] "RemoveContainer" containerID="151901f066c5f5c11a3766e4895346c333f5f67787fc8b8db442fe91f60bf6c9"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.216839 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-utilities\") pod \"7c605086-ba35-4534-8732-246afbfde953\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.216908 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ngf7\" (UniqueName: \"kubernetes.io/projected/7c605086-ba35-4534-8732-246afbfde953-kube-api-access-2ngf7\") pod \"7c605086-ba35-4534-8732-246afbfde953\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.216946 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-catalog-content\") pod \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.216965 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-catalog-content\") pod \"7c605086-ba35-4534-8732-246afbfde953\" (UID: \"7c605086-ba35-4534-8732-246afbfde953\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.216987 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmz5j\" (UniqueName: \"kubernetes.io/projected/eac9e47c-1b1d-4b22-9040-3a198c5758fe-kube-api-access-jmz5j\") pod \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.217063 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-utilities\") pod \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\" (UID: \"eac9e47c-1b1d-4b22-9040-3a198c5758fe\") "
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.217337 4759 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.217350 4759 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a6019120-bf7b-47df-9e54-c7761066eb48-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.217360 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.217369 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/031753f7-0b97-45ec-8e24-a6aeafb09d65-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.217378 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn7m4\" (UniqueName: \"kubernetes.io/projected/031753f7-0b97-45ec-8e24-a6aeafb09d65-kube-api-access-vn7m4\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.217389 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s6sm\" (UniqueName: \"kubernetes.io/projected/a6019120-bf7b-47df-9e54-c7761066eb48-kube-api-access-8s6sm\") on node \"crc\" DevicePath \"\""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.218788 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-utilities" (OuterVolumeSpecName: "utilities") pod "7c605086-ba35-4534-8732-246afbfde953" (UID: "7c605086-ba35-4534-8732-246afbfde953"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.219714 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-utilities" (OuterVolumeSpecName: "utilities") pod "eac9e47c-1b1d-4b22-9040-3a198c5758fe" (UID: "eac9e47c-1b1d-4b22-9040-3a198c5758fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.221401 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c605086-ba35-4534-8732-246afbfde953-kube-api-access-2ngf7" (OuterVolumeSpecName: "kube-api-access-2ngf7") pod "7c605086-ba35-4534-8732-246afbfde953" (UID: "7c605086-ba35-4534-8732-246afbfde953"). InnerVolumeSpecName "kube-api-access-2ngf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.223379 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vtnn"]
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.225994 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac9e47c-1b1d-4b22-9040-3a198c5758fe-kube-api-access-jmz5j" (OuterVolumeSpecName: "kube-api-access-jmz5j") pod "eac9e47c-1b1d-4b22-9040-3a198c5758fe" (UID: "eac9e47c-1b1d-4b22-9040-3a198c5758fe"). InnerVolumeSpecName "kube-api-access-jmz5j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.227523 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vtnn"]
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.235549 4759 scope.go:117] "RemoveContainer" containerID="d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969"
Dec 05 00:29:10 crc kubenswrapper[4759]: E1205 00:29:10.236530 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969\": container with ID starting with d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969 not found: ID does not exist" containerID="d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.236569 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969"} err="failed to get container status \"d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969\": rpc error: code = NotFound desc = could not find container \"d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969\": container with ID starting with d63bc2f41404f452bc4b9f988108f9c1fefd173ef4b341708d322acf4c690969 not found: ID does not exist"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.236597 4759 scope.go:117] "RemoveContainer" containerID="83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9"
Dec 05 00:29:10 crc kubenswrapper[4759]: E1205 00:29:10.236998 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9\": container with ID starting with 83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9 not found: ID does not exist" containerID="83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.237117 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9"} err="failed to get container status \"83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9\": rpc error: code = NotFound desc = could not find container \"83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9\": container with ID starting with 83157e58c3581f11ba92fb4e6ecb54079418523aa89e4012da041496e0b9a4b9 not found: ID does not exist"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.237218 4759 scope.go:117] "RemoveContainer" containerID="151901f066c5f5c11a3766e4895346c333f5f67787fc8b8db442fe91f60bf6c9"
Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.239504 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c605086-ba35-4534-8732-246afbfde953" (UID: "7c605086-ba35-4534-8732-246afbfde953"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:29:10 crc kubenswrapper[4759]: E1205 00:29:10.242653 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"151901f066c5f5c11a3766e4895346c333f5f67787fc8b8db442fe91f60bf6c9\": container with ID starting with 151901f066c5f5c11a3766e4895346c333f5f67787fc8b8db442fe91f60bf6c9 not found: ID does not exist" containerID="151901f066c5f5c11a3766e4895346c333f5f67787fc8b8db442fe91f60bf6c9" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.242685 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"151901f066c5f5c11a3766e4895346c333f5f67787fc8b8db442fe91f60bf6c9"} err="failed to get container status \"151901f066c5f5c11a3766e4895346c333f5f67787fc8b8db442fe91f60bf6c9\": rpc error: code = NotFound desc = could not find container \"151901f066c5f5c11a3766e4895346c333f5f67787fc8b8db442fe91f60bf6c9\": container with ID starting with 151901f066c5f5c11a3766e4895346c333f5f67787fc8b8db442fe91f60bf6c9 not found: ID does not exist" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.242711 4759 scope.go:117] "RemoveContainer" containerID="fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.258823 4759 scope.go:117] "RemoveContainer" containerID="f228a4c1abfee6a272823a673b81c3bf05adb215340c945f30e33e3652e5351a" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.277366 4759 scope.go:117] "RemoveContainer" containerID="992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.297710 4759 scope.go:117] "RemoveContainer" containerID="fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158" Dec 05 00:29:10 crc kubenswrapper[4759]: E1205 00:29:10.298460 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158\": container with ID starting with fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158 not found: ID does not exist" containerID="fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.298498 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158"} err="failed to get container status \"fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158\": rpc error: code = NotFound desc = could not find container \"fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158\": container with ID starting with fa2c964eb25cbb7ce32f63a50598bd7f557fb6cafd790f3e4ee3ad7497d35158 not found: ID does not exist" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.298529 4759 scope.go:117] "RemoveContainer" containerID="f228a4c1abfee6a272823a673b81c3bf05adb215340c945f30e33e3652e5351a" Dec 05 00:29:10 crc kubenswrapper[4759]: E1205 00:29:10.298969 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f228a4c1abfee6a272823a673b81c3bf05adb215340c945f30e33e3652e5351a\": container with ID starting with f228a4c1abfee6a272823a673b81c3bf05adb215340c945f30e33e3652e5351a not found: ID does not exist" containerID="f228a4c1abfee6a272823a673b81c3bf05adb215340c945f30e33e3652e5351a" Dec 05 00:29:10 crc 
kubenswrapper[4759]: I1205 00:29:10.299044 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f228a4c1abfee6a272823a673b81c3bf05adb215340c945f30e33e3652e5351a"} err="failed to get container status \"f228a4c1abfee6a272823a673b81c3bf05adb215340c945f30e33e3652e5351a\": rpc error: code = NotFound desc = could not find container \"f228a4c1abfee6a272823a673b81c3bf05adb215340c945f30e33e3652e5351a\": container with ID starting with f228a4c1abfee6a272823a673b81c3bf05adb215340c945f30e33e3652e5351a not found: ID does not exist" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.299086 4759 scope.go:117] "RemoveContainer" containerID="992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b" Dec 05 00:29:10 crc kubenswrapper[4759]: E1205 00:29:10.299618 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b\": container with ID starting with 992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b not found: ID does not exist" containerID="992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.299643 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b"} err="failed to get container status \"992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b\": rpc error: code = NotFound desc = could not find container \"992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b\": container with ID starting with 992062727f7f5675bd03c44791c3a47088237b92561ff0178a3c9ec5a576d75b not found: ID does not exist" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.299659 4759 scope.go:117] "RemoveContainer" containerID="5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.312864 4759 scope.go:117] "RemoveContainer" containerID="5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770" Dec 05 00:29:10 crc kubenswrapper[4759]: E1205 00:29:10.313232 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770\": container with ID starting with 5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770 not found: ID does not exist" containerID="5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.313279 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770"} err="failed to get container status \"5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770\": rpc error: code = NotFound desc = could not find container \"5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770\": container with ID starting with 5b4e29f6aa750cb3ee9d21c2642302f828f78a19e251285bfab6550fcd945770 not found: ID does not exist" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.313337 4759 scope.go:117] "RemoveContainer" containerID="9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.318547 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmz5j\" 
(UniqueName: \"kubernetes.io/projected/eac9e47c-1b1d-4b22-9040-3a198c5758fe-kube-api-access-jmz5j\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.318575 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.318587 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.318599 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ngf7\" (UniqueName: \"kubernetes.io/projected/7c605086-ba35-4534-8732-246afbfde953-kube-api-access-2ngf7\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.318607 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c605086-ba35-4534-8732-246afbfde953-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.343120 4759 scope.go:117] "RemoveContainer" containerID="250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.347626 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eac9e47c-1b1d-4b22-9040-3a198c5758fe" (UID: "eac9e47c-1b1d-4b22-9040-3a198c5758fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.353428 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n7tt2"] Dec 05 00:29:10 crc kubenswrapper[4759]: W1205 00:29:10.364716 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod423b5d0f_1418_420b_80ca_f05d0087c85e.slice/crio-fe89e5d9e5b54478124d1194e92d93a8de1c09c8a834f5a023f6abe8e9405291 WatchSource:0}: Error finding container fe89e5d9e5b54478124d1194e92d93a8de1c09c8a834f5a023f6abe8e9405291: Status 404 returned error can't find the container with id fe89e5d9e5b54478124d1194e92d93a8de1c09c8a834f5a023f6abe8e9405291 Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.367832 4759 scope.go:117] "RemoveContainer" containerID="b6f9967db2dbf318591c01fe1326d9b74737c8aec18f5a52e59db4527e51eb8a" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.375799 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.389890 4759 scope.go:117] "RemoveContainer" containerID="9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a" Dec 05 00:29:10 crc kubenswrapper[4759]: E1205 00:29:10.391037 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a\": container with ID starting with 9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a not found: ID does not exist" containerID="9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.391085 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a"} err="failed to get container status \"9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a\": rpc error: code = NotFound desc = could not find container \"9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a\": container with ID starting with 9bbd9d2299b0da08fa9d13ff46ac15a031a863c3b2a053822998378182ae496a not found: ID does not exist" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.391159 4759 scope.go:117] "RemoveContainer" containerID="250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107" Dec 05 00:29:10 crc kubenswrapper[4759]: E1205 00:29:10.391547 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107\": container with ID starting with 250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107 not found: ID does not exist" containerID="250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.391629 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107"} err="failed to get container status \"250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107\": rpc error: code = NotFound desc = could not find container \"250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107\": container with ID starting with 250e33b6ba2d6b5de876dd1b74adb3cbeb1d2ec374716eeb7914b4749c4cc107 not found: ID does not exist" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.391648 4759 scope.go:117] "RemoveContainer" containerID="b6f9967db2dbf318591c01fe1326d9b74737c8aec18f5a52e59db4527e51eb8a" Dec 05 00:29:10 crc kubenswrapper[4759]: E1205 00:29:10.391911 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6f9967db2dbf318591c01fe1326d9b74737c8aec18f5a52e59db4527e51eb8a\": container with ID starting with b6f9967db2dbf318591c01fe1326d9b74737c8aec18f5a52e59db4527e51eb8a not found: ID does not exist" containerID="b6f9967db2dbf318591c01fe1326d9b74737c8aec18f5a52e59db4527e51eb8a" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.391965 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6f9967db2dbf318591c01fe1326d9b74737c8aec18f5a52e59db4527e51eb8a"} err="failed to get container status \"b6f9967db2dbf318591c01fe1326d9b74737c8aec18f5a52e59db4527e51eb8a\": rpc error: code = 
NotFound desc = could not find container \"b6f9967db2dbf318591c01fe1326d9b74737c8aec18f5a52e59db4527e51eb8a\": container with ID starting with b6f9967db2dbf318591c01fe1326d9b74737c8aec18f5a52e59db4527e51eb8a not found: ID does not exist" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.420232 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac9e47c-1b1d-4b22-9040-3a198c5758fe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.497453 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pb4c2"] Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.500515 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pb4c2"] Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.507784 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5s4sw"] Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.516257 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5s4sw"] Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.522405 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-catalog-content\") pod \"39167ad7-8b39-4c2b-b783-88427c69b7eb\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.522504 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vfqj\" (UniqueName: \"kubernetes.io/projected/39167ad7-8b39-4c2b-b783-88427c69b7eb-kube-api-access-4vfqj\") pod \"39167ad7-8b39-4c2b-b783-88427c69b7eb\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.522540 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-utilities\") pod \"39167ad7-8b39-4c2b-b783-88427c69b7eb\" (UID: \"39167ad7-8b39-4c2b-b783-88427c69b7eb\") " Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.523399 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-utilities" (OuterVolumeSpecName: "utilities") pod "39167ad7-8b39-4c2b-b783-88427c69b7eb" (UID: "39167ad7-8b39-4c2b-b783-88427c69b7eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.525494 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39167ad7-8b39-4c2b-b783-88427c69b7eb-kube-api-access-4vfqj" (OuterVolumeSpecName: "kube-api-access-4vfqj") pod "39167ad7-8b39-4c2b-b783-88427c69b7eb" (UID: "39167ad7-8b39-4c2b-b783-88427c69b7eb"). InnerVolumeSpecName "kube-api-access-4vfqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.536272 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rtslc"] Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.542398 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rtslc"] Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.575772 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39167ad7-8b39-4c2b-b783-88427c69b7eb" (UID: "39167ad7-8b39-4c2b-b783-88427c69b7eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.623821 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.623865 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vfqj\" (UniqueName: \"kubernetes.io/projected/39167ad7-8b39-4c2b-b783-88427c69b7eb-kube-api-access-4vfqj\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:10 crc kubenswrapper[4759]: I1205 00:29:10.623876 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39167ad7-8b39-4c2b-b783-88427c69b7eb-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.132832 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l5772"] Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133393 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerName="extract-content" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133406 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerName="extract-content" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133421 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6019120-bf7b-47df-9e54-c7761066eb48" containerName="marketplace-operator" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133427 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6019120-bf7b-47df-9e54-c7761066eb48" containerName="marketplace-operator" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133439 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerName="extract-utilities" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133452 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerName="extract-utilities" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133465 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133473 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133481 4759 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerName="extract-utilities" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133488 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerName="extract-utilities" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133499 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c605086-ba35-4534-8732-246afbfde953" containerName="extract-utilities" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133506 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c605086-ba35-4534-8732-246afbfde953" containerName="extract-utilities" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133518 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerName="extract-content" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133525 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerName="extract-content" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133536 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c605086-ba35-4534-8732-246afbfde953" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133543 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c605086-ba35-4534-8732-246afbfde953" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133552 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c605086-ba35-4534-8732-246afbfde953" containerName="extract-content" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133557 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c605086-ba35-4534-8732-246afbfde953" containerName="extract-content" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133564 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerName="extract-utilities" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133571 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerName="extract-utilities" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133582 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133594 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133603 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133612 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: E1205 00:29:11.133621 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerName="extract-content" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133628 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerName="extract-content" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133725 4759 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a6019120-bf7b-47df-9e54-c7761066eb48" containerName="marketplace-operator" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133742 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133753 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133765 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c605086-ba35-4534-8732-246afbfde953" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.133777 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" containerName="registry-server" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.134270 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.146000 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l5772"] Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.161801 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031753f7-0b97-45ec-8e24-a6aeafb09d65" path="/var/lib/kubelet/pods/031753f7-0b97-45ec-8e24-a6aeafb09d65/volumes" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.162609 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c605086-ba35-4534-8732-246afbfde953" path="/var/lib/kubelet/pods/7c605086-ba35-4534-8732-246afbfde953/volumes" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.163419 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6019120-bf7b-47df-9e54-c7761066eb48" path="/var/lib/kubelet/pods/a6019120-bf7b-47df-9e54-c7761066eb48/volumes" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.164539 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eac9e47c-1b1d-4b22-9040-3a198c5758fe" path="/var/lib/kubelet/pods/eac9e47c-1b1d-4b22-9040-3a198c5758fe/volumes" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.193396 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lgc9q" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.193380 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgc9q" event={"ID":"39167ad7-8b39-4c2b-b783-88427c69b7eb","Type":"ContainerDied","Data":"3a3f8b9e7949311ca69089cca3085bbb5fc4d5be80b183af6e3a0ac6acd74fbe"} Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.193819 4759 scope.go:117] "RemoveContainer" containerID="87a1a8dba60dfff88e376723c7cfcaec94c55efd5890d53b7303c6b5cd4dfbad" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.199064 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2" event={"ID":"423b5d0f-1418-420b-80ca-f05d0087c85e","Type":"ContainerStarted","Data":"8ad3976bd5364036c3967378cfb928b69074766b8238b287013945d3d20a4fbd"} Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.199092 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2" event={"ID":"423b5d0f-1418-420b-80ca-f05d0087c85e","Type":"ContainerStarted","Data":"fe89e5d9e5b54478124d1194e92d93a8de1c09c8a834f5a023f6abe8e9405291"} Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.200095 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.203263 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.211739 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lgc9q"] Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.212527 4759 scope.go:117] "RemoveContainer" containerID="e83cb4675a2ff55a6f6c4f6da8bd3a7edd0f780e80c7ade403fdce0487502197" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.215179 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lgc9q"] Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.231248 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.231299 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-trusted-ca\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.231629 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-registry-certificates\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.231652 4759 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.231669 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-bound-sa-token\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.231694 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-registry-tls\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.231733 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj2gf\" (UniqueName: \"kubernetes.io/projected/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-kube-api-access-rj2gf\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.231759 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.232691 4759 scope.go:117] "RemoveContainer" containerID="c2f4fb9941d3d511464b337f646e42bc6c07be0c6417e05c04fafcd612ad78ce" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.235072 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n7tt2" podStartSLOduration=2.235039945 podStartE2EDuration="2.235039945s" podCreationTimestamp="2025-12-05 00:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:29:11.232785338 +0000 UTC m=+370.448446308" watchObservedRunningTime="2025-12-05 00:29:11.235039945 +0000 UTC m=+370.450700905" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.263179 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.345378 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj2gf\" (UniqueName: 
\"kubernetes.io/projected/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-kube-api-access-rj2gf\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.345504 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.345590 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-trusted-ca\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.345699 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-registry-certificates\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.345783 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.346298 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.346428 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-bound-sa-token\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.346999 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-registry-tls\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.350067 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-trusted-ca\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc 
kubenswrapper[4759]: I1205 00:29:11.352653 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.359172 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-registry-tls\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.359769 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-registry-certificates\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.364909 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-bound-sa-token\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.365205 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj2gf\" (UniqueName: \"kubernetes.io/projected/ccdfb9aa-32ae-4a06-aaec-28c3aac72597-kube-api-access-rj2gf\") pod \"image-registry-66df7c8f76-l5772\" (UID: \"ccdfb9aa-32ae-4a06-aaec-28c3aac72597\") " pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.449023 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.756285 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l5772"] Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.898560 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fhwcd"] Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.900112 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.902768 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.911825 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fhwcd"] Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.963898 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a767665-6baa-48cf-98b9-825fa8ff6b63-utilities\") pod \"certified-operators-fhwcd\" (UID: \"5a767665-6baa-48cf-98b9-825fa8ff6b63\") " pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.963966 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6thk2\" (UniqueName: \"kubernetes.io/projected/5a767665-6baa-48cf-98b9-825fa8ff6b63-kube-api-access-6thk2\") pod \"certified-operators-fhwcd\" (UID: \"5a767665-6baa-48cf-98b9-825fa8ff6b63\") " pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:11 crc kubenswrapper[4759]: I1205 00:29:11.964020 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a767665-6baa-48cf-98b9-825fa8ff6b63-catalog-content\") pod \"certified-operators-fhwcd\" (UID: \"5a767665-6baa-48cf-98b9-825fa8ff6b63\") " pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.065456 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a767665-6baa-48cf-98b9-825fa8ff6b63-catalog-content\") pod \"certified-operators-fhwcd\" (UID: \"5a767665-6baa-48cf-98b9-825fa8ff6b63\") " pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.065556 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a767665-6baa-48cf-98b9-825fa8ff6b63-utilities\") pod \"certified-operators-fhwcd\" (UID: \"5a767665-6baa-48cf-98b9-825fa8ff6b63\") " pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.065589 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6thk2\" (UniqueName: \"kubernetes.io/projected/5a767665-6baa-48cf-98b9-825fa8ff6b63-kube-api-access-6thk2\") pod \"certified-operators-fhwcd\" (UID: \"5a767665-6baa-48cf-98b9-825fa8ff6b63\") " pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.066078 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a767665-6baa-48cf-98b9-825fa8ff6b63-catalog-content\") pod \"certified-operators-fhwcd\" (UID: \"5a767665-6baa-48cf-98b9-825fa8ff6b63\") " pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.066154 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a767665-6baa-48cf-98b9-825fa8ff6b63-utilities\") pod \"certified-operators-fhwcd\" (UID: 
\"5a767665-6baa-48cf-98b9-825fa8ff6b63\") " pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.086340 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6thk2\" (UniqueName: \"kubernetes.io/projected/5a767665-6baa-48cf-98b9-825fa8ff6b63-kube-api-access-6thk2\") pod \"certified-operators-fhwcd\" (UID: \"5a767665-6baa-48cf-98b9-825fa8ff6b63\") " pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.211810 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l5772" event={"ID":"ccdfb9aa-32ae-4a06-aaec-28c3aac72597","Type":"ContainerStarted","Data":"b81d6d0eb11c31c45e388d67ce885bb17b03fc697ce61466118b38de0fc075ba"} Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.211862 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l5772" event={"ID":"ccdfb9aa-32ae-4a06-aaec-28c3aac72597","Type":"ContainerStarted","Data":"d7ec8e64d54b41297ee17fd08aeeeea6a943b07e4b70de2d2939195ca55698aa"} Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.212092 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.225576 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.236026 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-l5772" podStartSLOduration=1.236005984 podStartE2EDuration="1.236005984s" podCreationTimestamp="2025-12-05 00:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:29:12.231237975 +0000 UTC m=+371.446898925" watchObservedRunningTime="2025-12-05 00:29:12.236005984 +0000 UTC m=+371.451666944" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.409211 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fhwcd"] Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.897558 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vqxx5"] Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.898888 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.902110 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.913167 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqxx5"] Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.975740 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc48a8ea-811f-4524-aebb-6518efb9c7f5-catalog-content\") pod \"community-operators-vqxx5\" (UID: \"cc48a8ea-811f-4524-aebb-6518efb9c7f5\") " pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.976018 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc48a8ea-811f-4524-aebb-6518efb9c7f5-utilities\") pod \"community-operators-vqxx5\" (UID: \"cc48a8ea-811f-4524-aebb-6518efb9c7f5\") " pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:12 crc kubenswrapper[4759]: I1205 00:29:12.976119 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22c7g\" (UniqueName: \"kubernetes.io/projected/cc48a8ea-811f-4524-aebb-6518efb9c7f5-kube-api-access-22c7g\") pod \"community-operators-vqxx5\" (UID: \"cc48a8ea-811f-4524-aebb-6518efb9c7f5\") " pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.077574 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc48a8ea-811f-4524-aebb-6518efb9c7f5-catalog-content\") pod \"community-operators-vqxx5\" (UID: \"cc48a8ea-811f-4524-aebb-6518efb9c7f5\") " pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.077645 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc48a8ea-811f-4524-aebb-6518efb9c7f5-utilities\") pod \"community-operators-vqxx5\" (UID: \"cc48a8ea-811f-4524-aebb-6518efb9c7f5\") " pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.077676 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22c7g\" (UniqueName: \"kubernetes.io/projected/cc48a8ea-811f-4524-aebb-6518efb9c7f5-kube-api-access-22c7g\") pod \"community-operators-vqxx5\" (UID: \"cc48a8ea-811f-4524-aebb-6518efb9c7f5\") " pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.078117 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc48a8ea-811f-4524-aebb-6518efb9c7f5-catalog-content\") pod \"community-operators-vqxx5\" (UID: \"cc48a8ea-811f-4524-aebb-6518efb9c7f5\") " pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.078521 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc48a8ea-811f-4524-aebb-6518efb9c7f5-utilities\") pod \"community-operators-vqxx5\" (UID: 
\"cc48a8ea-811f-4524-aebb-6518efb9c7f5\") " pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.099084 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22c7g\" (UniqueName: \"kubernetes.io/projected/cc48a8ea-811f-4524-aebb-6518efb9c7f5-kube-api-access-22c7g\") pod \"community-operators-vqxx5\" (UID: \"cc48a8ea-811f-4524-aebb-6518efb9c7f5\") " pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.162847 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39167ad7-8b39-4c2b-b783-88427c69b7eb" path="/var/lib/kubelet/pods/39167ad7-8b39-4c2b-b783-88427c69b7eb/volumes" Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.218812 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.228966 4759 generic.go:334] "Generic (PLEG): container finished" podID="5a767665-6baa-48cf-98b9-825fa8ff6b63" containerID="c4560e02d432cdc6a555488e205e1b031c4418e28618f7e7e29f7f86d9718051" exitCode=0 Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.229031 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhwcd" event={"ID":"5a767665-6baa-48cf-98b9-825fa8ff6b63","Type":"ContainerDied","Data":"c4560e02d432cdc6a555488e205e1b031c4418e28618f7e7e29f7f86d9718051"} Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.229067 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhwcd" event={"ID":"5a767665-6baa-48cf-98b9-825fa8ff6b63","Type":"ContainerStarted","Data":"f7105e2455e83099b66c39e0c44578a0e3ab4d6e931baddb0a7c3898a6900157"} Dec 05 00:29:13 crc kubenswrapper[4759]: I1205 00:29:13.621126 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqxx5"] Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.235535 4759 generic.go:334] "Generic (PLEG): container finished" podID="cc48a8ea-811f-4524-aebb-6518efb9c7f5" containerID="4119671c1d2afd1b66b40be525084a4974abf0187fe297606c19d311fabd193b" exitCode=0 Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.235645 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqxx5" event={"ID":"cc48a8ea-811f-4524-aebb-6518efb9c7f5","Type":"ContainerDied","Data":"4119671c1d2afd1b66b40be525084a4974abf0187fe297606c19d311fabd193b"} Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.235923 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqxx5" event={"ID":"cc48a8ea-811f-4524-aebb-6518efb9c7f5","Type":"ContainerStarted","Data":"53bac95961e3200214f732af13df097afa7919b0a39f4b57abe43d5a41fd84ed"} Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.238563 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhwcd" event={"ID":"5a767665-6baa-48cf-98b9-825fa8ff6b63","Type":"ContainerStarted","Data":"c67f8faf452ebaa71e24b85137d0865e494b030fa10ec4c61afb2330dbd1f6b8"} Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.297776 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9l72t"] Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.299754 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.301486 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.312600 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9l72t"] Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.405165 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a197c31d-778b-4261-bfd1-0469436747e5-utilities\") pod \"redhat-marketplace-9l72t\" (UID: \"a197c31d-778b-4261-bfd1-0469436747e5\") " pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.405281 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6lld\" (UniqueName: \"kubernetes.io/projected/a197c31d-778b-4261-bfd1-0469436747e5-kube-api-access-w6lld\") pod \"redhat-marketplace-9l72t\" (UID: \"a197c31d-778b-4261-bfd1-0469436747e5\") " pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.405340 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a197c31d-778b-4261-bfd1-0469436747e5-catalog-content\") pod \"redhat-marketplace-9l72t\" (UID: \"a197c31d-778b-4261-bfd1-0469436747e5\") " pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.506940 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a197c31d-778b-4261-bfd1-0469436747e5-utilities\") pod \"redhat-marketplace-9l72t\" (UID: \"a197c31d-778b-4261-bfd1-0469436747e5\") " pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.507522 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6lld\" (UniqueName: \"kubernetes.io/projected/a197c31d-778b-4261-bfd1-0469436747e5-kube-api-access-w6lld\") pod \"redhat-marketplace-9l72t\" (UID: \"a197c31d-778b-4261-bfd1-0469436747e5\") " pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.507553 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a197c31d-778b-4261-bfd1-0469436747e5-catalog-content\") pod \"redhat-marketplace-9l72t\" (UID: \"a197c31d-778b-4261-bfd1-0469436747e5\") " pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.507602 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a197c31d-778b-4261-bfd1-0469436747e5-utilities\") pod \"redhat-marketplace-9l72t\" (UID: \"a197c31d-778b-4261-bfd1-0469436747e5\") " pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.507850 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a197c31d-778b-4261-bfd1-0469436747e5-catalog-content\") pod \"redhat-marketplace-9l72t\" (UID: 
\"a197c31d-778b-4261-bfd1-0469436747e5\") " pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.530714 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6lld\" (UniqueName: \"kubernetes.io/projected/a197c31d-778b-4261-bfd1-0469436747e5-kube-api-access-w6lld\") pod \"redhat-marketplace-9l72t\" (UID: \"a197c31d-778b-4261-bfd1-0469436747e5\") " pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.616755 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:14 crc kubenswrapper[4759]: I1205 00:29:14.812586 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9l72t"] Dec 05 00:29:14 crc kubenswrapper[4759]: W1205 00:29:14.818927 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda197c31d_778b_4261_bfd1_0469436747e5.slice/crio-5f12923e4a098064e0bde40a3c16fd081314e451ae6542a2eda909258d7c961e WatchSource:0}: Error finding container 5f12923e4a098064e0bde40a3c16fd081314e451ae6542a2eda909258d7c961e: Status 404 returned error can't find the container with id 5f12923e4a098064e0bde40a3c16fd081314e451ae6542a2eda909258d7c961e Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.247292 4759 generic.go:334] "Generic (PLEG): container finished" podID="5a767665-6baa-48cf-98b9-825fa8ff6b63" containerID="c67f8faf452ebaa71e24b85137d0865e494b030fa10ec4c61afb2330dbd1f6b8" exitCode=0 Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.247404 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhwcd" event={"ID":"5a767665-6baa-48cf-98b9-825fa8ff6b63","Type":"ContainerDied","Data":"c67f8faf452ebaa71e24b85137d0865e494b030fa10ec4c61afb2330dbd1f6b8"} Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.252521 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9l72t" event={"ID":"a197c31d-778b-4261-bfd1-0469436747e5","Type":"ContainerStarted","Data":"5f12923e4a098064e0bde40a3c16fd081314e451ae6542a2eda909258d7c961e"} Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.294805 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7bfqq"] Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.296020 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.298228 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.308205 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7bfqq"] Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.422610 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p577l\" (UniqueName: \"kubernetes.io/projected/af5bad0e-3c28-41e5-bd38-a9251291150c-kube-api-access-p577l\") pod \"redhat-operators-7bfqq\" (UID: \"af5bad0e-3c28-41e5-bd38-a9251291150c\") " pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.422668 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af5bad0e-3c28-41e5-bd38-a9251291150c-utilities\") pod \"redhat-operators-7bfqq\" (UID: \"af5bad0e-3c28-41e5-bd38-a9251291150c\") " pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.422779 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af5bad0e-3c28-41e5-bd38-a9251291150c-catalog-content\") pod \"redhat-operators-7bfqq\" (UID: \"af5bad0e-3c28-41e5-bd38-a9251291150c\") " pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.523783 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af5bad0e-3c28-41e5-bd38-a9251291150c-catalog-content\") pod \"redhat-operators-7bfqq\" (UID: \"af5bad0e-3c28-41e5-bd38-a9251291150c\") " pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.523868 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p577l\" (UniqueName: \"kubernetes.io/projected/af5bad0e-3c28-41e5-bd38-a9251291150c-kube-api-access-p577l\") pod \"redhat-operators-7bfqq\" (UID: \"af5bad0e-3c28-41e5-bd38-a9251291150c\") " pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.523902 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af5bad0e-3c28-41e5-bd38-a9251291150c-utilities\") pod \"redhat-operators-7bfqq\" (UID: \"af5bad0e-3c28-41e5-bd38-a9251291150c\") " pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.524445 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af5bad0e-3c28-41e5-bd38-a9251291150c-catalog-content\") pod \"redhat-operators-7bfqq\" (UID: \"af5bad0e-3c28-41e5-bd38-a9251291150c\") " pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.524496 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af5bad0e-3c28-41e5-bd38-a9251291150c-utilities\") pod \"redhat-operators-7bfqq\" (UID: \"af5bad0e-3c28-41e5-bd38-a9251291150c\") " 
pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.546746 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p577l\" (UniqueName: \"kubernetes.io/projected/af5bad0e-3c28-41e5-bd38-a9251291150c-kube-api-access-p577l\") pod \"redhat-operators-7bfqq\" (UID: \"af5bad0e-3c28-41e5-bd38-a9251291150c\") " pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:15 crc kubenswrapper[4759]: I1205 00:29:15.652458 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:15 crc kubenswrapper[4759]: E1205 00:29:15.759561 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda197c31d_778b_4261_bfd1_0469436747e5.slice/crio-90969fde0ec1a2c973e370f5e66a501196c8beba5953c73b1ef28c4670481651.scope\": RecentStats: unable to find data in memory cache]" Dec 05 00:29:16 crc kubenswrapper[4759]: I1205 00:29:16.057161 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7bfqq"] Dec 05 00:29:16 crc kubenswrapper[4759]: I1205 00:29:16.266035 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqxx5" event={"ID":"cc48a8ea-811f-4524-aebb-6518efb9c7f5","Type":"ContainerStarted","Data":"d002ba298e241f1fa430016afaf72bafe9cbd0140c80d4c6f6c2830e2068f9fc"} Dec 05 00:29:16 crc kubenswrapper[4759]: I1205 00:29:16.270396 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bfqq" event={"ID":"af5bad0e-3c28-41e5-bd38-a9251291150c","Type":"ContainerStarted","Data":"a20f2a1c436bac8e00b488d4d54f71a40b5ec229043c65522d9bd8bea4e4f2e5"} Dec 05 00:29:16 crc kubenswrapper[4759]: I1205 00:29:16.274492 4759 generic.go:334] "Generic (PLEG): container finished" podID="a197c31d-778b-4261-bfd1-0469436747e5" containerID="90969fde0ec1a2c973e370f5e66a501196c8beba5953c73b1ef28c4670481651" exitCode=0 Dec 05 00:29:16 crc kubenswrapper[4759]: I1205 00:29:16.274540 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9l72t" event={"ID":"a197c31d-778b-4261-bfd1-0469436747e5","Type":"ContainerDied","Data":"90969fde0ec1a2c973e370f5e66a501196c8beba5953c73b1ef28c4670481651"} Dec 05 00:29:17 crc kubenswrapper[4759]: I1205 00:29:17.281860 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhwcd" event={"ID":"5a767665-6baa-48cf-98b9-825fa8ff6b63","Type":"ContainerStarted","Data":"e859851ad1cc7c93b67538216571b864355cf3ee101188da56e2c98f0222c4c9"} Dec 05 00:29:17 crc kubenswrapper[4759]: I1205 00:29:17.285035 4759 generic.go:334] "Generic (PLEG): container finished" podID="cc48a8ea-811f-4524-aebb-6518efb9c7f5" containerID="d002ba298e241f1fa430016afaf72bafe9cbd0140c80d4c6f6c2830e2068f9fc" exitCode=0 Dec 05 00:29:17 crc kubenswrapper[4759]: I1205 00:29:17.285127 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqxx5" event={"ID":"cc48a8ea-811f-4524-aebb-6518efb9c7f5","Type":"ContainerDied","Data":"d002ba298e241f1fa430016afaf72bafe9cbd0140c80d4c6f6c2830e2068f9fc"} Dec 05 00:29:17 crc kubenswrapper[4759]: I1205 00:29:17.286685 4759 generic.go:334] "Generic (PLEG): container finished" podID="af5bad0e-3c28-41e5-bd38-a9251291150c" 
containerID="b8088a6c1006cbcf6e2734219a7ef99e1f8a355d021dbca1c1ebfa5effa7af8b" exitCode=0 Dec 05 00:29:17 crc kubenswrapper[4759]: I1205 00:29:17.286717 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bfqq" event={"ID":"af5bad0e-3c28-41e5-bd38-a9251291150c","Type":"ContainerDied","Data":"b8088a6c1006cbcf6e2734219a7ef99e1f8a355d021dbca1c1ebfa5effa7af8b"} Dec 05 00:29:17 crc kubenswrapper[4759]: I1205 00:29:17.311480 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fhwcd" podStartSLOduration=3.356574309 podStartE2EDuration="6.311462898s" podCreationTimestamp="2025-12-05 00:29:11 +0000 UTC" firstStartedPulling="2025-12-05 00:29:13.230324018 +0000 UTC m=+372.445984968" lastFinishedPulling="2025-12-05 00:29:16.185212607 +0000 UTC m=+375.400873557" observedRunningTime="2025-12-05 00:29:17.307803757 +0000 UTC m=+376.523464707" watchObservedRunningTime="2025-12-05 00:29:17.311462898 +0000 UTC m=+376.527123848" Dec 05 00:29:20 crc kubenswrapper[4759]: I1205 00:29:20.302489 4759 generic.go:334] "Generic (PLEG): container finished" podID="a197c31d-778b-4261-bfd1-0469436747e5" containerID="af68102a3b24b0c187ca7d383c034d98e09c5937a3657fd4f42aacb9c89ddcb2" exitCode=0 Dec 05 00:29:20 crc kubenswrapper[4759]: I1205 00:29:20.302581 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9l72t" event={"ID":"a197c31d-778b-4261-bfd1-0469436747e5","Type":"ContainerDied","Data":"af68102a3b24b0c187ca7d383c034d98e09c5937a3657fd4f42aacb9c89ddcb2"} Dec 05 00:29:20 crc kubenswrapper[4759]: I1205 00:29:20.305603 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqxx5" event={"ID":"cc48a8ea-811f-4524-aebb-6518efb9c7f5","Type":"ContainerStarted","Data":"cc514964f784b65e4c9f2cd24c55c138f03cdfb5b9b6b6da7f0a85ee7cc20390"} Dec 05 00:29:20 crc kubenswrapper[4759]: I1205 00:29:20.310262 4759 generic.go:334] "Generic (PLEG): container finished" podID="af5bad0e-3c28-41e5-bd38-a9251291150c" containerID="d7c82ed661475b3a22c612fe611495b0bff02231012b24aedbc016eadf47ced2" exitCode=0 Dec 05 00:29:20 crc kubenswrapper[4759]: I1205 00:29:20.310324 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bfqq" event={"ID":"af5bad0e-3c28-41e5-bd38-a9251291150c","Type":"ContainerDied","Data":"d7c82ed661475b3a22c612fe611495b0bff02231012b24aedbc016eadf47ced2"} Dec 05 00:29:20 crc kubenswrapper[4759]: I1205 00:29:20.354352 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vqxx5" podStartSLOduration=3.669848956 podStartE2EDuration="8.354329956s" podCreationTimestamp="2025-12-05 00:29:12 +0000 UTC" firstStartedPulling="2025-12-05 00:29:14.237172555 +0000 UTC m=+373.452833505" lastFinishedPulling="2025-12-05 00:29:18.921653545 +0000 UTC m=+378.137314505" observedRunningTime="2025-12-05 00:29:20.343902415 +0000 UTC m=+379.559563365" watchObservedRunningTime="2025-12-05 00:29:20.354329956 +0000 UTC m=+379.569990906" Dec 05 00:29:21 crc kubenswrapper[4759]: I1205 00:29:21.318022 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9l72t" event={"ID":"a197c31d-778b-4261-bfd1-0469436747e5","Type":"ContainerStarted","Data":"1b9aa7494cd5c1f5b914e7a9a2a73cb4b4c14e5ca2214efab3bc9c2f2de28b4b"} Dec 05 00:29:21 crc kubenswrapper[4759]: I1205 00:29:21.321098 4759 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7bfqq" event={"ID":"af5bad0e-3c28-41e5-bd38-a9251291150c","Type":"ContainerStarted","Data":"4c88e7884a36fb6b5d71aa6f402e373e5e52018ebdb0ff2de35f98b1bd0ba035"} Dec 05 00:29:21 crc kubenswrapper[4759]: I1205 00:29:21.352815 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9l72t" podStartSLOduration=2.699562363 podStartE2EDuration="7.352786873s" podCreationTimestamp="2025-12-05 00:29:14 +0000 UTC" firstStartedPulling="2025-12-05 00:29:16.27774783 +0000 UTC m=+375.493408780" lastFinishedPulling="2025-12-05 00:29:20.93097232 +0000 UTC m=+380.146633290" observedRunningTime="2025-12-05 00:29:21.341933502 +0000 UTC m=+380.557594452" watchObservedRunningTime="2025-12-05 00:29:21.352786873 +0000 UTC m=+380.568447823" Dec 05 00:29:21 crc kubenswrapper[4759]: I1205 00:29:21.364895 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7bfqq" podStartSLOduration=2.959835525 podStartE2EDuration="6.364874185s" podCreationTimestamp="2025-12-05 00:29:15 +0000 UTC" firstStartedPulling="2025-12-05 00:29:17.288834313 +0000 UTC m=+376.504495263" lastFinishedPulling="2025-12-05 00:29:20.693872973 +0000 UTC m=+379.909533923" observedRunningTime="2025-12-05 00:29:21.362630019 +0000 UTC m=+380.578290969" watchObservedRunningTime="2025-12-05 00:29:21.364874185 +0000 UTC m=+380.580535125" Dec 05 00:29:22 crc kubenswrapper[4759]: I1205 00:29:22.226833 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:22 crc kubenswrapper[4759]: I1205 00:29:22.226890 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:22 crc kubenswrapper[4759]: I1205 00:29:22.264902 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:22 crc kubenswrapper[4759]: I1205 00:29:22.369132 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fhwcd" Dec 05 00:29:23 crc kubenswrapper[4759]: I1205 00:29:23.219206 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:23 crc kubenswrapper[4759]: I1205 00:29:23.219584 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:23 crc kubenswrapper[4759]: I1205 00:29:23.261553 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:24 crc kubenswrapper[4759]: I1205 00:29:24.617425 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:24 crc kubenswrapper[4759]: I1205 00:29:24.618582 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:24 crc kubenswrapper[4759]: I1205 00:29:24.659169 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:24 crc kubenswrapper[4759]: I1205 00:29:24.966711 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"] Dec 05 00:29:24 crc kubenswrapper[4759]: I1205 00:29:24.966931 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq" podUID="cba371bf-29fe-493f-b0a8-1616619e0f08" containerName="route-controller-manager" containerID="cri-o://631a340cfc16741fd63583eb7e88ccd6a8c79fb9b77de7331a802936caff95b1" gracePeriod=30 Dec 05 00:29:25 crc kubenswrapper[4759]: I1205 00:29:25.653076 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:25 crc kubenswrapper[4759]: I1205 00:29:25.653561 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:26 crc kubenswrapper[4759]: I1205 00:29:26.401368 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9l72t" Dec 05 00:29:26 crc kubenswrapper[4759]: I1205 00:29:26.690495 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7bfqq" podUID="af5bad0e-3c28-41e5-bd38-a9251291150c" containerName="registry-server" probeResult="failure" output=< Dec 05 00:29:26 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 00:29:26 crc kubenswrapper[4759]: > Dec 05 00:29:28 crc kubenswrapper[4759]: I1205 00:29:28.355013 4759 generic.go:334] "Generic (PLEG): container finished" podID="cba371bf-29fe-493f-b0a8-1616619e0f08" containerID="631a340cfc16741fd63583eb7e88ccd6a8c79fb9b77de7331a802936caff95b1" exitCode=0 Dec 05 00:29:28 crc kubenswrapper[4759]: I1205 00:29:28.355086 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq" event={"ID":"cba371bf-29fe-493f-b0a8-1616619e0f08","Type":"ContainerDied","Data":"631a340cfc16741fd63583eb7e88ccd6a8c79fb9b77de7331a802936caff95b1"} Dec 05 00:29:29 crc kubenswrapper[4759]: I1205 00:29:29.687553 4759 patch_prober.go:28] interesting pod/route-controller-manager-599c9649f6-57bkq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Dec 05 00:29:29 crc kubenswrapper[4759]: I1205 00:29:29.687618 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq" podUID="cba371bf-29fe-493f-b0a8-1616619e0f08" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.170160 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.201361 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr"] Dec 05 00:29:30 crc kubenswrapper[4759]: E1205 00:29:30.201631 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba371bf-29fe-493f-b0a8-1616619e0f08" containerName="route-controller-manager" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.201652 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba371bf-29fe-493f-b0a8-1616619e0f08" containerName="route-controller-manager" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.201773 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba371bf-29fe-493f-b0a8-1616619e0f08" containerName="route-controller-manager" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.202188 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.207705 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrqr2\" (UniqueName: \"kubernetes.io/projected/cba371bf-29fe-493f-b0a8-1616619e0f08-kube-api-access-lrqr2\") pod \"cba371bf-29fe-493f-b0a8-1616619e0f08\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.207783 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba371bf-29fe-493f-b0a8-1616619e0f08-serving-cert\") pod \"cba371bf-29fe-493f-b0a8-1616619e0f08\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.207833 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-config\") pod \"cba371bf-29fe-493f-b0a8-1616619e0f08\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.208098 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr"] Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.207878 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-client-ca\") pod \"cba371bf-29fe-493f-b0a8-1616619e0f08\" (UID: \"cba371bf-29fe-493f-b0a8-1616619e0f08\") " Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.208239 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/350ad693-2a37-45d5-8c6f-20440bc9bc4b-client-ca\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: \"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.208292 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v692t\" (UniqueName: \"kubernetes.io/projected/350ad693-2a37-45d5-8c6f-20440bc9bc4b-kube-api-access-v692t\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: 
\"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.208385 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/350ad693-2a37-45d5-8c6f-20440bc9bc4b-config\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: \"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.208427 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/350ad693-2a37-45d5-8c6f-20440bc9bc4b-serving-cert\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: \"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.209369 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-config" (OuterVolumeSpecName: "config") pod "cba371bf-29fe-493f-b0a8-1616619e0f08" (UID: "cba371bf-29fe-493f-b0a8-1616619e0f08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.209475 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-client-ca" (OuterVolumeSpecName: "client-ca") pod "cba371bf-29fe-493f-b0a8-1616619e0f08" (UID: "cba371bf-29fe-493f-b0a8-1616619e0f08"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.232881 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba371bf-29fe-493f-b0a8-1616619e0f08-kube-api-access-lrqr2" (OuterVolumeSpecName: "kube-api-access-lrqr2") pod "cba371bf-29fe-493f-b0a8-1616619e0f08" (UID: "cba371bf-29fe-493f-b0a8-1616619e0f08"). InnerVolumeSpecName "kube-api-access-lrqr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.244107 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba371bf-29fe-493f-b0a8-1616619e0f08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cba371bf-29fe-493f-b0a8-1616619e0f08" (UID: "cba371bf-29fe-493f-b0a8-1616619e0f08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.309781 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/350ad693-2a37-45d5-8c6f-20440bc9bc4b-config\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: \"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.309918 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/350ad693-2a37-45d5-8c6f-20440bc9bc4b-serving-cert\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: \"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.310725 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/350ad693-2a37-45d5-8c6f-20440bc9bc4b-client-ca\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: \"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.310804 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v692t\" (UniqueName: \"kubernetes.io/projected/350ad693-2a37-45d5-8c6f-20440bc9bc4b-kube-api-access-v692t\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: \"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.310890 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.310918 4759 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cba371bf-29fe-493f-b0a8-1616619e0f08-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.310938 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrqr2\" (UniqueName: \"kubernetes.io/projected/cba371bf-29fe-493f-b0a8-1616619e0f08-kube-api-access-lrqr2\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.310957 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba371bf-29fe-493f-b0a8-1616619e0f08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.310983 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/350ad693-2a37-45d5-8c6f-20440bc9bc4b-config\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: \"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.311457 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/350ad693-2a37-45d5-8c6f-20440bc9bc4b-client-ca\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: 
\"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.314276 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/350ad693-2a37-45d5-8c6f-20440bc9bc4b-serving-cert\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: \"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.326729 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v692t\" (UniqueName: \"kubernetes.io/projected/350ad693-2a37-45d5-8c6f-20440bc9bc4b-kube-api-access-v692t\") pod \"route-controller-manager-c55fc58c-tkjlr\" (UID: \"350ad693-2a37-45d5-8c6f-20440bc9bc4b\") " pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.367964 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq" event={"ID":"cba371bf-29fe-493f-b0a8-1616619e0f08","Type":"ContainerDied","Data":"3b6d9c0d3f929aa392d719af98b5131ed220ec793fa9deb7120c97298b127514"} Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.368035 4759 scope.go:117] "RemoveContainer" containerID="631a340cfc16741fd63583eb7e88ccd6a8c79fb9b77de7331a802936caff95b1" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.368055 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.400921 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"] Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.404621 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c9649f6-57bkq"] Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.557779 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:30 crc kubenswrapper[4759]: I1205 00:29:30.981859 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr"] Dec 05 00:29:31 crc kubenswrapper[4759]: I1205 00:29:31.163331 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba371bf-29fe-493f-b0a8-1616619e0f08" path="/var/lib/kubelet/pods/cba371bf-29fe-493f-b0a8-1616619e0f08/volumes" Dec 05 00:29:31 crc kubenswrapper[4759]: I1205 00:29:31.374628 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" event={"ID":"350ad693-2a37-45d5-8c6f-20440bc9bc4b","Type":"ContainerStarted","Data":"8da6c71d0551a3c689a73291024c8dca394790311ed32f81cf25f929510615a2"} Dec 05 00:29:31 crc kubenswrapper[4759]: I1205 00:29:31.374673 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" event={"ID":"350ad693-2a37-45d5-8c6f-20440bc9bc4b","Type":"ContainerStarted","Data":"aecb84ea342026d9583920628ea0d86793c7cdfb19efd9e5a06cc0548d6e0d50"} Dec 05 00:29:31 crc kubenswrapper[4759]: I1205 00:29:31.376038 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:31 crc kubenswrapper[4759]: I1205 00:29:31.396938 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" podStartSLOduration=7.396914932 podStartE2EDuration="7.396914932s" podCreationTimestamp="2025-12-05 00:29:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:29:31.391506808 +0000 UTC m=+390.607167768" watchObservedRunningTime="2025-12-05 00:29:31.396914932 +0000 UTC m=+390.612575882" Dec 05 00:29:31 crc kubenswrapper[4759]: I1205 00:29:31.444357 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c55fc58c-tkjlr" Dec 05 00:29:31 crc kubenswrapper[4759]: I1205 00:29:31.457262 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-l5772" Dec 05 00:29:31 crc kubenswrapper[4759]: I1205 00:29:31.528213 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h2xln"] Dec 05 00:29:33 crc kubenswrapper[4759]: I1205 00:29:33.264429 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vqxx5" Dec 05 00:29:34 crc kubenswrapper[4759]: I1205 00:29:34.433197 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:29:34 crc kubenswrapper[4759]: I1205 00:29:34.433274 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:29:34 crc kubenswrapper[4759]: I1205 00:29:34.433345 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:29:34 crc kubenswrapper[4759]: I1205 00:29:34.433941 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19e15a4dc41cf010fb0db81342a545243e6c32979a3aae139e39a7f8a97809a8"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 00:29:34 crc kubenswrapper[4759]: I1205 00:29:34.434014 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://19e15a4dc41cf010fb0db81342a545243e6c32979a3aae139e39a7f8a97809a8" gracePeriod=600 Dec 05 00:29:35 crc kubenswrapper[4759]: I1205 00:29:35.693606 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:35 crc kubenswrapper[4759]: I1205 00:29:35.735774 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7bfqq" Dec 05 00:29:36 crc kubenswrapper[4759]: I1205 00:29:36.401951 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="19e15a4dc41cf010fb0db81342a545243e6c32979a3aae139e39a7f8a97809a8" exitCode=0 Dec 05 00:29:36 crc kubenswrapper[4759]: I1205 00:29:36.402028 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"19e15a4dc41cf010fb0db81342a545243e6c32979a3aae139e39a7f8a97809a8"} Dec 05 00:29:36 crc kubenswrapper[4759]: I1205 00:29:36.402095 4759 scope.go:117] "RemoveContainer" containerID="3a5dcb73ac21f74b2aea74632b70e29e419059fe5cfa09153602008d77eab4fc" Dec 05 00:29:38 crc kubenswrapper[4759]: I1205 00:29:38.413970 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"8291b2f562c8ceee6453c4a2ea4879386376a25a8f93dcefd90d8fd30f5e5702"} Dec 05 00:29:44 crc kubenswrapper[4759]: I1205 00:29:44.954499 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-986d5957d-jgvm6"] Dec 05 00:29:44 crc kubenswrapper[4759]: I1205 00:29:44.955710 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" podUID="c924045b-eac9-49b6-ab25-c256dd04fa74" containerName="controller-manager" containerID="cri-o://3cf22669fe6f774c404eaaae5c8853d884b4cf32e702bb8cb7339bdd1ad3c1ac" gracePeriod=30 Dec 05 00:29:45 crc kubenswrapper[4759]: I1205 00:29:45.461223 4759 generic.go:334] "Generic (PLEG): container finished" podID="c924045b-eac9-49b6-ab25-c256dd04fa74" containerID="3cf22669fe6f774c404eaaae5c8853d884b4cf32e702bb8cb7339bdd1ad3c1ac" exitCode=0 Dec 05 00:29:45 crc kubenswrapper[4759]: I1205 00:29:45.461563 4759 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" event={"ID":"c924045b-eac9-49b6-ab25-c256dd04fa74","Type":"ContainerDied","Data":"3cf22669fe6f774c404eaaae5c8853d884b4cf32e702bb8cb7339bdd1ad3c1ac"} Dec 05 00:29:45 crc kubenswrapper[4759]: I1205 00:29:45.878134 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.034524 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c924045b-eac9-49b6-ab25-c256dd04fa74-serving-cert\") pod \"c924045b-eac9-49b6-ab25-c256dd04fa74\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.034649 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-proxy-ca-bundles\") pod \"c924045b-eac9-49b6-ab25-c256dd04fa74\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.034703 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbz9p\" (UniqueName: \"kubernetes.io/projected/c924045b-eac9-49b6-ab25-c256dd04fa74-kube-api-access-qbz9p\") pod \"c924045b-eac9-49b6-ab25-c256dd04fa74\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.034761 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-config\") pod \"c924045b-eac9-49b6-ab25-c256dd04fa74\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.034870 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-client-ca\") pod \"c924045b-eac9-49b6-ab25-c256dd04fa74\" (UID: \"c924045b-eac9-49b6-ab25-c256dd04fa74\") " Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.035475 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-client-ca" (OuterVolumeSpecName: "client-ca") pod "c924045b-eac9-49b6-ab25-c256dd04fa74" (UID: "c924045b-eac9-49b6-ab25-c256dd04fa74"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.035580 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c924045b-eac9-49b6-ab25-c256dd04fa74" (UID: "c924045b-eac9-49b6-ab25-c256dd04fa74"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.036097 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-config" (OuterVolumeSpecName: "config") pod "c924045b-eac9-49b6-ab25-c256dd04fa74" (UID: "c924045b-eac9-49b6-ab25-c256dd04fa74"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.041058 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c924045b-eac9-49b6-ab25-c256dd04fa74-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c924045b-eac9-49b6-ab25-c256dd04fa74" (UID: "c924045b-eac9-49b6-ab25-c256dd04fa74"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.043963 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c924045b-eac9-49b6-ab25-c256dd04fa74-kube-api-access-qbz9p" (OuterVolumeSpecName: "kube-api-access-qbz9p") pod "c924045b-eac9-49b6-ab25-c256dd04fa74" (UID: "c924045b-eac9-49b6-ab25-c256dd04fa74"). InnerVolumeSpecName "kube-api-access-qbz9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.135871 4759 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.135905 4759 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c924045b-eac9-49b6-ab25-c256dd04fa74-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.135915 4759 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.135930 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbz9p\" (UniqueName: \"kubernetes.io/projected/c924045b-eac9-49b6-ab25-c256dd04fa74-kube-api-access-qbz9p\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.135943 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c924045b-eac9-49b6-ab25-c256dd04fa74-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.432442 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb"] Dec 05 00:29:46 crc kubenswrapper[4759]: E1205 00:29:46.432688 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c924045b-eac9-49b6-ab25-c256dd04fa74" containerName="controller-manager" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.432721 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c924045b-eac9-49b6-ab25-c256dd04fa74" containerName="controller-manager" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.432843 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c924045b-eac9-49b6-ab25-c256dd04fa74" containerName="controller-manager" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.433339 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.451256 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb"] Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.478164 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" event={"ID":"c924045b-eac9-49b6-ab25-c256dd04fa74","Type":"ContainerDied","Data":"24fe584c425caeff91db016bb8422b4493e88169e2983fc8430951e5c7f422db"} Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.478226 4759 scope.go:117] "RemoveContainer" containerID="3cf22669fe6f774c404eaaae5c8853d884b4cf32e702bb8cb7339bdd1ad3c1ac" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.478230 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-986d5957d-jgvm6" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.523680 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-986d5957d-jgvm6"] Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.527815 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-986d5957d-jgvm6"] Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.541974 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75006b53-124c-453b-8f41-ef77484aa612-proxy-ca-bundles\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.542078 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75006b53-124c-453b-8f41-ef77484aa612-client-ca\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.542116 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75006b53-124c-453b-8f41-ef77484aa612-config\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.542173 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75006b53-124c-453b-8f41-ef77484aa612-serving-cert\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.542196 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qq4m\" (UniqueName: \"kubernetes.io/projected/75006b53-124c-453b-8f41-ef77484aa612-kube-api-access-6qq4m\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " 
pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.643762 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75006b53-124c-453b-8f41-ef77484aa612-serving-cert\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.644491 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qq4m\" (UniqueName: \"kubernetes.io/projected/75006b53-124c-453b-8f41-ef77484aa612-kube-api-access-6qq4m\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.644570 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75006b53-124c-453b-8f41-ef77484aa612-proxy-ca-bundles\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.647728 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75006b53-124c-453b-8f41-ef77484aa612-proxy-ca-bundles\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.648952 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75006b53-124c-453b-8f41-ef77484aa612-client-ca\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.649065 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75006b53-124c-453b-8f41-ef77484aa612-config\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.650711 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75006b53-124c-453b-8f41-ef77484aa612-client-ca\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.652014 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75006b53-124c-453b-8f41-ef77484aa612-config\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.652413 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/75006b53-124c-453b-8f41-ef77484aa612-serving-cert\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.679120 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qq4m\" (UniqueName: \"kubernetes.io/projected/75006b53-124c-453b-8f41-ef77484aa612-kube-api-access-6qq4m\") pod \"controller-manager-8699c6c5bb-d5tjb\" (UID: \"75006b53-124c-453b-8f41-ef77484aa612\") " pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:46 crc kubenswrapper[4759]: I1205 00:29:46.775831 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:47 crc kubenswrapper[4759]: I1205 00:29:47.021475 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb"] Dec 05 00:29:47 crc kubenswrapper[4759]: I1205 00:29:47.161631 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c924045b-eac9-49b6-ab25-c256dd04fa74" path="/var/lib/kubelet/pods/c924045b-eac9-49b6-ab25-c256dd04fa74/volumes" Dec 05 00:29:47 crc kubenswrapper[4759]: I1205 00:29:47.484293 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" event={"ID":"75006b53-124c-453b-8f41-ef77484aa612","Type":"ContainerStarted","Data":"7faf1d35d5515d0c63fa9aba3b6fcbe1392bcc4bf10634c3df182a3a063aa749"} Dec 05 00:29:47 crc kubenswrapper[4759]: I1205 00:29:47.484356 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" event={"ID":"75006b53-124c-453b-8f41-ef77484aa612","Type":"ContainerStarted","Data":"449b43c12280354b3d7e29097ec9d1ebb0b7ca301df8778ebdb2719eab3e341b"} Dec 05 00:29:47 crc kubenswrapper[4759]: I1205 00:29:47.485236 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:47 crc kubenswrapper[4759]: I1205 00:29:47.503893 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" Dec 05 00:29:47 crc kubenswrapper[4759]: I1205 00:29:47.532125 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8699c6c5bb-d5tjb" podStartSLOduration=3.532105951 podStartE2EDuration="3.532105951s" podCreationTimestamp="2025-12-05 00:29:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:29:47.510054961 +0000 UTC m=+406.725715911" watchObservedRunningTime="2025-12-05 00:29:47.532105951 +0000 UTC m=+406.747766901" Dec 05 00:29:56 crc kubenswrapper[4759]: I1205 00:29:56.581057 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" podUID="6171c662-5317-43a1-bc72-e0d9fbe54466" containerName="registry" containerID="cri-o://04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17" gracePeriod=30 Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.009861 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.195212 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-certificates\") pod \"6171c662-5317-43a1-bc72-e0d9fbe54466\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.195278 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqtzd\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-kube-api-access-wqtzd\") pod \"6171c662-5317-43a1-bc72-e0d9fbe54466\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.195329 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-trusted-ca\") pod \"6171c662-5317-43a1-bc72-e0d9fbe54466\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.195369 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-bound-sa-token\") pod \"6171c662-5317-43a1-bc72-e0d9fbe54466\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.195406 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6171c662-5317-43a1-bc72-e0d9fbe54466-installation-pull-secrets\") pod \"6171c662-5317-43a1-bc72-e0d9fbe54466\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.195442 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6171c662-5317-43a1-bc72-e0d9fbe54466-ca-trust-extracted\") pod \"6171c662-5317-43a1-bc72-e0d9fbe54466\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.195642 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"6171c662-5317-43a1-bc72-e0d9fbe54466\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.196008 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-tls\") pod \"6171c662-5317-43a1-bc72-e0d9fbe54466\" (UID: \"6171c662-5317-43a1-bc72-e0d9fbe54466\") " Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.196633 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6171c662-5317-43a1-bc72-e0d9fbe54466" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.196897 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6171c662-5317-43a1-bc72-e0d9fbe54466" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.201687 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6171c662-5317-43a1-bc72-e0d9fbe54466-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6171c662-5317-43a1-bc72-e0d9fbe54466" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.202599 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6171c662-5317-43a1-bc72-e0d9fbe54466" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.203418 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-kube-api-access-wqtzd" (OuterVolumeSpecName: "kube-api-access-wqtzd") pod "6171c662-5317-43a1-bc72-e0d9fbe54466" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466"). InnerVolumeSpecName "kube-api-access-wqtzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.203982 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6171c662-5317-43a1-bc72-e0d9fbe54466" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.212144 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "6171c662-5317-43a1-bc72-e0d9fbe54466" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.213928 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6171c662-5317-43a1-bc72-e0d9fbe54466-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6171c662-5317-43a1-bc72-e0d9fbe54466" (UID: "6171c662-5317-43a1-bc72-e0d9fbe54466"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.298048 4759 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6171c662-5317-43a1-bc72-e0d9fbe54466-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.298107 4759 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.298125 4759 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.298148 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqtzd\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-kube-api-access-wqtzd\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.298166 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6171c662-5317-43a1-bc72-e0d9fbe54466-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.298183 4759 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6171c662-5317-43a1-bc72-e0d9fbe54466-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.298200 4759 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6171c662-5317-43a1-bc72-e0d9fbe54466-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.555599 4759 generic.go:334] "Generic (PLEG): container finished" podID="6171c662-5317-43a1-bc72-e0d9fbe54466" containerID="04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17" exitCode=0 Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.555657 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" event={"ID":"6171c662-5317-43a1-bc72-e0d9fbe54466","Type":"ContainerDied","Data":"04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17"} Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.555689 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" event={"ID":"6171c662-5317-43a1-bc72-e0d9fbe54466","Type":"ContainerDied","Data":"436fcc5104b5b363c1411d3768fcc900d4c0f299f1c54586d8656e5131f1207a"} Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.555702 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h2xln" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.555708 4759 scope.go:117] "RemoveContainer" containerID="04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.580142 4759 scope.go:117] "RemoveContainer" containerID="04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17" Dec 05 00:29:57 crc kubenswrapper[4759]: E1205 00:29:57.580694 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17\": container with ID starting with 04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17 not found: ID does not exist" containerID="04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.580753 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17"} err="failed to get container status \"04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17\": rpc error: code = NotFound desc = could not find container \"04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17\": container with ID starting with 04f1d46a0bd4b3df19f2731ee786f71b30ef68d820784c5734eca7d394117e17 not found: ID does not exist" Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.614241 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h2xln"] Dec 05 00:29:57 crc kubenswrapper[4759]: I1205 00:29:57.623332 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h2xln"] Dec 05 00:29:59 crc kubenswrapper[4759]: I1205 00:29:59.163878 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6171c662-5317-43a1-bc72-e0d9fbe54466" path="/var/lib/kubelet/pods/6171c662-5317-43a1-bc72-e0d9fbe54466/volumes" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.210814 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr"] Dec 05 00:30:00 crc kubenswrapper[4759]: E1205 00:30:00.211733 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6171c662-5317-43a1-bc72-e0d9fbe54466" containerName="registry" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.211749 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="6171c662-5317-43a1-bc72-e0d9fbe54466" containerName="registry" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.211847 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="6171c662-5317-43a1-bc72-e0d9fbe54466" containerName="registry" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.212620 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.215159 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.215206 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.217059 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr"] Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.341332 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf6gz\" (UniqueName: \"kubernetes.io/projected/4d9382a4-5036-4fb4-850c-e5a26d299f02-kube-api-access-zf6gz\") pod \"collect-profiles-29414910-gbhcr\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.341415 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d9382a4-5036-4fb4-850c-e5a26d299f02-config-volume\") pod \"collect-profiles-29414910-gbhcr\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.341441 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d9382a4-5036-4fb4-850c-e5a26d299f02-secret-volume\") pod \"collect-profiles-29414910-gbhcr\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.442429 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf6gz\" (UniqueName: \"kubernetes.io/projected/4d9382a4-5036-4fb4-850c-e5a26d299f02-kube-api-access-zf6gz\") pod \"collect-profiles-29414910-gbhcr\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.442533 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d9382a4-5036-4fb4-850c-e5a26d299f02-config-volume\") pod \"collect-profiles-29414910-gbhcr\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.442574 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d9382a4-5036-4fb4-850c-e5a26d299f02-secret-volume\") pod \"collect-profiles-29414910-gbhcr\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.444373 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d9382a4-5036-4fb4-850c-e5a26d299f02-config-volume\") pod 
\"collect-profiles-29414910-gbhcr\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.448490 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d9382a4-5036-4fb4-850c-e5a26d299f02-secret-volume\") pod \"collect-profiles-29414910-gbhcr\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.463565 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf6gz\" (UniqueName: \"kubernetes.io/projected/4d9382a4-5036-4fb4-850c-e5a26d299f02-kube-api-access-zf6gz\") pod \"collect-profiles-29414910-gbhcr\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.532993 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:00 crc kubenswrapper[4759]: W1205 00:30:00.966831 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d9382a4_5036_4fb4_850c_e5a26d299f02.slice/crio-b0334d0f23483317d93ee3da13cdc4f9878771948d42e3c73b143d31570002f2 WatchSource:0}: Error finding container b0334d0f23483317d93ee3da13cdc4f9878771948d42e3c73b143d31570002f2: Status 404 returned error can't find the container with id b0334d0f23483317d93ee3da13cdc4f9878771948d42e3c73b143d31570002f2 Dec 05 00:30:00 crc kubenswrapper[4759]: I1205 00:30:00.967794 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr"] Dec 05 00:30:01 crc kubenswrapper[4759]: I1205 00:30:01.584760 4759 generic.go:334] "Generic (PLEG): container finished" podID="4d9382a4-5036-4fb4-850c-e5a26d299f02" containerID="c95db5087895e2b94b651c69c0250b8e9fae547296d801bb002c598b5146ff73" exitCode=0 Dec 05 00:30:01 crc kubenswrapper[4759]: I1205 00:30:01.584941 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" event={"ID":"4d9382a4-5036-4fb4-850c-e5a26d299f02","Type":"ContainerDied","Data":"c95db5087895e2b94b651c69c0250b8e9fae547296d801bb002c598b5146ff73"} Dec 05 00:30:01 crc kubenswrapper[4759]: I1205 00:30:01.585067 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" event={"ID":"4d9382a4-5036-4fb4-850c-e5a26d299f02","Type":"ContainerStarted","Data":"b0334d0f23483317d93ee3da13cdc4f9878771948d42e3c73b143d31570002f2"} Dec 05 00:30:02 crc kubenswrapper[4759]: I1205 00:30:02.915682 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.073517 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d9382a4-5036-4fb4-850c-e5a26d299f02-secret-volume\") pod \"4d9382a4-5036-4fb4-850c-e5a26d299f02\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.073696 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d9382a4-5036-4fb4-850c-e5a26d299f02-config-volume\") pod \"4d9382a4-5036-4fb4-850c-e5a26d299f02\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.073788 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf6gz\" (UniqueName: \"kubernetes.io/projected/4d9382a4-5036-4fb4-850c-e5a26d299f02-kube-api-access-zf6gz\") pod \"4d9382a4-5036-4fb4-850c-e5a26d299f02\" (UID: \"4d9382a4-5036-4fb4-850c-e5a26d299f02\") " Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.074742 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d9382a4-5036-4fb4-850c-e5a26d299f02-config-volume" (OuterVolumeSpecName: "config-volume") pod "4d9382a4-5036-4fb4-850c-e5a26d299f02" (UID: "4d9382a4-5036-4fb4-850c-e5a26d299f02"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.079718 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d9382a4-5036-4fb4-850c-e5a26d299f02-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4d9382a4-5036-4fb4-850c-e5a26d299f02" (UID: "4d9382a4-5036-4fb4-850c-e5a26d299f02"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.079961 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d9382a4-5036-4fb4-850c-e5a26d299f02-kube-api-access-zf6gz" (OuterVolumeSpecName: "kube-api-access-zf6gz") pod "4d9382a4-5036-4fb4-850c-e5a26d299f02" (UID: "4d9382a4-5036-4fb4-850c-e5a26d299f02"). InnerVolumeSpecName "kube-api-access-zf6gz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.175066 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d9382a4-5036-4fb4-850c-e5a26d299f02-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.175097 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf6gz\" (UniqueName: \"kubernetes.io/projected/4d9382a4-5036-4fb4-850c-e5a26d299f02-kube-api-access-zf6gz\") on node \"crc\" DevicePath \"\"" Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.175109 4759 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d9382a4-5036-4fb4-850c-e5a26d299f02-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.601550 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" event={"ID":"4d9382a4-5036-4fb4-850c-e5a26d299f02","Type":"ContainerDied","Data":"b0334d0f23483317d93ee3da13cdc4f9878771948d42e3c73b143d31570002f2"} Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.601627 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0334d0f23483317d93ee3da13cdc4f9878771948d42e3c73b143d31570002f2" Dec 05 00:30:03 crc kubenswrapper[4759]: I1205 00:30:03.601640 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr" Dec 05 00:32:04 crc kubenswrapper[4759]: I1205 00:32:04.433849 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:32:04 crc kubenswrapper[4759]: I1205 00:32:04.434569 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:32:34 crc kubenswrapper[4759]: I1205 00:32:34.433660 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:32:34 crc kubenswrapper[4759]: I1205 00:32:34.434463 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:33:04 crc kubenswrapper[4759]: I1205 00:33:04.433586 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:33:04 crc kubenswrapper[4759]: I1205 
00:33:04.434223 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:33:04 crc kubenswrapper[4759]: I1205 00:33:04.434285 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:33:04 crc kubenswrapper[4759]: I1205 00:33:04.435030 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8291b2f562c8ceee6453c4a2ea4879386376a25a8f93dcefd90d8fd30f5e5702"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 00:33:04 crc kubenswrapper[4759]: I1205 00:33:04.435123 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://8291b2f562c8ceee6453c4a2ea4879386376a25a8f93dcefd90d8fd30f5e5702" gracePeriod=600 Dec 05 00:33:04 crc kubenswrapper[4759]: I1205 00:33:04.683135 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="8291b2f562c8ceee6453c4a2ea4879386376a25a8f93dcefd90d8fd30f5e5702" exitCode=0 Dec 05 00:33:04 crc kubenswrapper[4759]: I1205 00:33:04.683218 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"8291b2f562c8ceee6453c4a2ea4879386376a25a8f93dcefd90d8fd30f5e5702"} Dec 05 00:33:04 crc kubenswrapper[4759]: I1205 00:33:04.683648 4759 scope.go:117] "RemoveContainer" containerID="19e15a4dc41cf010fb0db81342a545243e6c32979a3aae139e39a7f8a97809a8" Dec 05 00:33:05 crc kubenswrapper[4759]: I1205 00:33:05.693987 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"18e2e3c4d5d7e9b0a92421362e0a25a15c418034c2ed08a024ef5b3fb196fc6f"} Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.231619 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc"] Dec 05 00:34:59 crc kubenswrapper[4759]: E1205 00:34:59.232287 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9382a4-5036-4fb4-850c-e5a26d299f02" containerName="collect-profiles" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.232319 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9382a4-5036-4fb4-850c-e5a26d299f02" containerName="collect-profiles" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.232451 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d9382a4-5036-4fb4-850c-e5a26d299f02" containerName="collect-profiles" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.233270 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.236125 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.250061 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc"] Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.401552 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.401662 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.401734 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fz67\" (UniqueName: \"kubernetes.io/projected/ab2043d5-dc62-4d55-908e-fdae23325471-kube-api-access-8fz67\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.504160 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.504261 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fz67\" (UniqueName: \"kubernetes.io/projected/ab2043d5-dc62-4d55-908e-fdae23325471-kube-api-access-8fz67\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.504400 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.505038 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.505119 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.540687 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fz67\" (UniqueName: \"kubernetes.io/projected/ab2043d5-dc62-4d55-908e-fdae23325471-kube-api-access-8fz67\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.566590 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:34:59 crc kubenswrapper[4759]: I1205 00:34:59.792992 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc"] Dec 05 00:35:00 crc kubenswrapper[4759]: E1205 00:35:00.260494 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab2043d5_dc62_4d55_908e_fdae23325471.slice/crio-conmon-ac542e17c93c692d84b9da39029a9eab24a0809784037c9efdadd08de60745e1.scope\": RecentStats: unable to find data in memory cache]" Dec 05 00:35:00 crc kubenswrapper[4759]: I1205 00:35:00.306874 4759 generic.go:334] "Generic (PLEG): container finished" podID="ab2043d5-dc62-4d55-908e-fdae23325471" containerID="ac542e17c93c692d84b9da39029a9eab24a0809784037c9efdadd08de60745e1" exitCode=0 Dec 05 00:35:00 crc kubenswrapper[4759]: I1205 00:35:00.306927 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" event={"ID":"ab2043d5-dc62-4d55-908e-fdae23325471","Type":"ContainerDied","Data":"ac542e17c93c692d84b9da39029a9eab24a0809784037c9efdadd08de60745e1"} Dec 05 00:35:00 crc kubenswrapper[4759]: I1205 00:35:00.306958 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" event={"ID":"ab2043d5-dc62-4d55-908e-fdae23325471","Type":"ContainerStarted","Data":"1217ecf936061c900494b29d6d9a19d60eddbd13678ef5b3748168fdd956b9eb"} Dec 05 00:35:00 crc kubenswrapper[4759]: I1205 00:35:00.308417 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 00:35:02 crc kubenswrapper[4759]: I1205 00:35:02.325053 4759 generic.go:334] "Generic (PLEG): container finished" podID="ab2043d5-dc62-4d55-908e-fdae23325471" containerID="3879f4a0c577a5b0dd2cad060555250ad62889819d4226b017a28b56dd5d21c4" exitCode=0 Dec 05 00:35:02 crc kubenswrapper[4759]: I1205 00:35:02.325152 4759 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" event={"ID":"ab2043d5-dc62-4d55-908e-fdae23325471","Type":"ContainerDied","Data":"3879f4a0c577a5b0dd2cad060555250ad62889819d4226b017a28b56dd5d21c4"} Dec 05 00:35:03 crc kubenswrapper[4759]: I1205 00:35:03.334148 4759 generic.go:334] "Generic (PLEG): container finished" podID="ab2043d5-dc62-4d55-908e-fdae23325471" containerID="fab2cfcc2bc68c15e505cd98d7b9a18e5c58c4baa8197e443dd94fe4f7f0fbe9" exitCode=0 Dec 05 00:35:03 crc kubenswrapper[4759]: I1205 00:35:03.334238 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" event={"ID":"ab2043d5-dc62-4d55-908e-fdae23325471","Type":"ContainerDied","Data":"fab2cfcc2bc68c15e505cd98d7b9a18e5c58c4baa8197e443dd94fe4f7f0fbe9"} Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.433739 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.435601 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.644857 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.780237 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fz67\" (UniqueName: \"kubernetes.io/projected/ab2043d5-dc62-4d55-908e-fdae23325471-kube-api-access-8fz67\") pod \"ab2043d5-dc62-4d55-908e-fdae23325471\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.780284 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-bundle\") pod \"ab2043d5-dc62-4d55-908e-fdae23325471\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.780390 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-util\") pod \"ab2043d5-dc62-4d55-908e-fdae23325471\" (UID: \"ab2043d5-dc62-4d55-908e-fdae23325471\") " Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.782238 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-bundle" (OuterVolumeSpecName: "bundle") pod "ab2043d5-dc62-4d55-908e-fdae23325471" (UID: "ab2043d5-dc62-4d55-908e-fdae23325471"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.788748 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab2043d5-dc62-4d55-908e-fdae23325471-kube-api-access-8fz67" (OuterVolumeSpecName: "kube-api-access-8fz67") pod "ab2043d5-dc62-4d55-908e-fdae23325471" (UID: "ab2043d5-dc62-4d55-908e-fdae23325471"). InnerVolumeSpecName "kube-api-access-8fz67". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.796632 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-util" (OuterVolumeSpecName: "util") pod "ab2043d5-dc62-4d55-908e-fdae23325471" (UID: "ab2043d5-dc62-4d55-908e-fdae23325471"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.881518 4759 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-util\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.881763 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fz67\" (UniqueName: \"kubernetes.io/projected/ab2043d5-dc62-4d55-908e-fdae23325471-kube-api-access-8fz67\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:04 crc kubenswrapper[4759]: I1205 00:35:04.881828 4759 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab2043d5-dc62-4d55-908e-fdae23325471-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:05 crc kubenswrapper[4759]: I1205 00:35:05.354050 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" event={"ID":"ab2043d5-dc62-4d55-908e-fdae23325471","Type":"ContainerDied","Data":"1217ecf936061c900494b29d6d9a19d60eddbd13678ef5b3748168fdd956b9eb"} Dec 05 00:35:05 crc kubenswrapper[4759]: I1205 00:35:05.354125 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1217ecf936061c900494b29d6d9a19d60eddbd13678ef5b3748168fdd956b9eb" Dec 05 00:35:05 crc kubenswrapper[4759]: I1205 00:35:05.354171 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.471056 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mbhwx"] Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.471867 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovn-controller" containerID="cri-o://b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a" gracePeriod=30 Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.471951 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="nbdb" containerID="cri-o://56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9" gracePeriod=30 Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.471971 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129" gracePeriod=30 Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.471997 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="kube-rbac-proxy-node" containerID="cri-o://efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48" gracePeriod=30 Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.472069 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="sbdb" containerID="cri-o://7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0" gracePeriod=30 Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.472119 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovn-acl-logging" containerID="cri-o://4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c" gracePeriod=30 Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.472091 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="northd" containerID="cri-o://b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce" gracePeriod=30 Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.512095 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" containerID="cri-o://5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527" gracePeriod=30 Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.840028 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/3.log" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.841872 4759 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovn-acl-logging/0.log" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.842231 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovn-controller/0.log" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.842712 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907475 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wlm8s"] Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907729 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab2043d5-dc62-4d55-908e-fdae23325471" containerName="pull" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907743 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2043d5-dc62-4d55-908e-fdae23325471" containerName="pull" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907760 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="northd" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907768 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="northd" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907784 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907791 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907803 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907812 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907819 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="kubecfg-setup" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907827 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="kubecfg-setup" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907835 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907843 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907855 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovn-acl-logging" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907863 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovn-acl-logging" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907874 
4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907881 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907892 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907924 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907935 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="nbdb" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907942 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="nbdb" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907953 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="kube-rbac-proxy-node" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907960 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="kube-rbac-proxy-node" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907970 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907978 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.907990 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovn-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.907998 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovn-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.908008 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab2043d5-dc62-4d55-908e-fdae23325471" containerName="extract" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908015 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2043d5-dc62-4d55-908e-fdae23325471" containerName="extract" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.908025 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="sbdb" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908032 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="sbdb" Dec 05 00:35:10 crc kubenswrapper[4759]: E1205 00:35:10.908044 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab2043d5-dc62-4d55-908e-fdae23325471" containerName="util" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908052 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2043d5-dc62-4d55-908e-fdae23325471" containerName="util" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908157 4759 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908168 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908177 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab2043d5-dc62-4d55-908e-fdae23325471" containerName="extract" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908188 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="kube-rbac-proxy-node" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908198 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovn-acl-logging" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908210 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovn-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908221 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908230 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908238 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="northd" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908247 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908257 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="sbdb" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908267 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="ovnkube-controller" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.908278 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerName="nbdb" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.911202 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.973722 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-ovn\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.973767 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-script-lib\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.973806 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovn-node-metrics-cert\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.973822 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-netns\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.973844 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-systemd\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.973843 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.973866 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-env-overrides\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.973921 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.973958 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t67cz\" (UniqueName: \"kubernetes.io/projected/45fa490b-1113-4ee6-9604-dc322ca11bd3-kube-api-access-t67cz\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.973995 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-ovn-kubernetes\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974025 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-node-log\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974055 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-etc-openvswitch\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974066 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974089 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-systemd-units\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974101 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-node-log" (OuterVolumeSpecName: "node-log") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974114 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-openvswitch\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974139 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974194 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-netd\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974137 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974221 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-kubelet\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974221 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974152 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974204 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974248 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-bin\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974273 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974282 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974354 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-config\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974384 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-var-lib-openvswitch\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974414 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-slash\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974437 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974453 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974458 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-log-socket\") pod \"45fa490b-1113-4ee6-9604-dc322ca11bd3\" (UID: \"45fa490b-1113-4ee6-9604-dc322ca11bd3\") " Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974481 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-log-socket" (OuterVolumeSpecName: "log-socket") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974502 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-slash" (OuterVolumeSpecName: "host-slash") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). 
InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974524 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974614 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974621 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-slash\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974645 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-etc-openvswitch\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974669 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-run-ovn\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974759 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974758 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-run-netns\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974808 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-systemd-units\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974833 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-run-ovn-kubernetes\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974855 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-kubelet\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974914 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/927e0a92-b426-4d74-bc7b-56a4baca4b61-env-overrides\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.974946 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/927e0a92-b426-4d74-bc7b-56a4baca4b61-ovnkube-config\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975035 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-cni-netd\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975070 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-run-systemd\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975098 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/927e0a92-b426-4d74-bc7b-56a4baca4b61-ovnkube-script-lib\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975149 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-node-log\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975168 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/927e0a92-b426-4d74-bc7b-56a4baca4b61-ovn-node-metrics-cert\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975190 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hc8p\" (UniqueName: \"kubernetes.io/projected/927e0a92-b426-4d74-bc7b-56a4baca4b61-kube-api-access-7hc8p\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975218 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-cni-bin\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975242 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-run-openvswitch\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975260 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-var-lib-openvswitch\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975276 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-log-socket\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975293 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 
00:35:10.975371 4759 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975385 4759 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975396 4759 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975407 4759 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975427 4759 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975438 4759 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-node-log\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975448 4759 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975456 4759 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975464 4759 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975474 4759 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975485 4759 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975494 4759 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975507 4759 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975518 4759 reconciler_common.go:293] "Volume detached for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975529 4759 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-slash\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975538 4759 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.975546 4759 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-log-socket\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.979295 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.979318 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45fa490b-1113-4ee6-9604-dc322ca11bd3-kube-api-access-t67cz" (OuterVolumeSpecName: "kube-api-access-t67cz") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "kube-api-access-t67cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:35:10 crc kubenswrapper[4759]: I1205 00:35:10.986243 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "45fa490b-1113-4ee6-9604-dc322ca11bd3" (UID: "45fa490b-1113-4ee6-9604-dc322ca11bd3"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076445 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-cni-bin\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076566 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-run-openvswitch\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076605 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-var-lib-openvswitch\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076647 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-log-socket\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076695 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076742 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-slash\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076782 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-etc-openvswitch\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076823 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-run-ovn\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076864 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-run-netns\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 
00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076902 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-systemd-units\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076938 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-run-ovn-kubernetes\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.076977 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-kubelet\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.077023 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/927e0a92-b426-4d74-bc7b-56a4baca4b61-env-overrides\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.077064 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/927e0a92-b426-4d74-bc7b-56a4baca4b61-ovnkube-config\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.077122 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-cni-netd\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.077182 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-run-systemd\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.077246 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/927e0a92-b426-4d74-bc7b-56a4baca4b61-ovnkube-script-lib\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.077350 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-node-log\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.077401 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7hc8p\" (UniqueName: \"kubernetes.io/projected/927e0a92-b426-4d74-bc7b-56a4baca4b61-kube-api-access-7hc8p\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.077441 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/927e0a92-b426-4d74-bc7b-56a4baca4b61-ovn-node-metrics-cert\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.077515 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t67cz\" (UniqueName: \"kubernetes.io/projected/45fa490b-1113-4ee6-9604-dc322ca11bd3-kube-api-access-t67cz\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.077551 4759 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45fa490b-1113-4ee6-9604-dc322ca11bd3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.077574 4759 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45fa490b-1113-4ee6-9604-dc322ca11bd3-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078353 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-systemd-units\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078434 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-cni-bin\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078465 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-run-openvswitch\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078511 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-var-lib-openvswitch\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078522 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-cni-netd\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078561 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078539 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-log-socket\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078593 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-slash\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078629 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-etc-openvswitch\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078630 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-run-ovn-kubernetes\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078666 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-run-ovn\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078705 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-kubelet\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.078736 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-host-run-netns\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.079572 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/927e0a92-b426-4d74-bc7b-56a4baca4b61-ovnkube-script-lib\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.079689 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-run-systemd\") pod 
\"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.079761 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/927e0a92-b426-4d74-bc7b-56a4baca4b61-env-overrides\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.079766 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/927e0a92-b426-4d74-bc7b-56a4baca4b61-node-log\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.080170 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/927e0a92-b426-4d74-bc7b-56a4baca4b61-ovnkube-config\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.083360 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/927e0a92-b426-4d74-bc7b-56a4baca4b61-ovn-node-metrics-cert\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.101441 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hc8p\" (UniqueName: \"kubernetes.io/projected/927e0a92-b426-4d74-bc7b-56a4baca4b61-kube-api-access-7hc8p\") pod \"ovnkube-node-wlm8s\" (UID: \"927e0a92-b426-4d74-bc7b-56a4baca4b61\") " pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.228085 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.386542 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovnkube-controller/3.log" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.389008 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovn-acl-logging/0.log" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.389531 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbhwx_45fa490b-1113-4ee6-9604-dc322ca11bd3/ovn-controller/0.log" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390076 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527" exitCode=0 Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390114 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0" exitCode=0 Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390148 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9" exitCode=0 Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390159 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce" exitCode=0 Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390171 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129" exitCode=0 Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390179 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48" exitCode=0 Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390191 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c" exitCode=143 Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390138 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390199 4759 generic.go:334] "Generic (PLEG): container finished" podID="45fa490b-1113-4ee6-9604-dc322ca11bd3" containerID="b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a" exitCode=143 Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390147 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390427 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390461 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390475 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390483 4759 scope.go:117] "RemoveContainer" containerID="5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527" Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390486 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390616 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390633 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390647 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390653 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390660 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390666 4759 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390673 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390680 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390687 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390694 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390703 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390714 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390721 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390731 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390738 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390744 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390751 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390758 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390764 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390771 4759 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390778 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390786 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390796 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390804 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390811 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390818 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390824 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390830 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390837 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390844 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390850 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390876 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390892 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbhwx" event={"ID":"45fa490b-1113-4ee6-9604-dc322ca11bd3","Type":"ContainerDied","Data":"1cb4ef7ccc289586aa16f5b531c9a00f2cf64fb13c07df4d007ca3f84648072b"} Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390905 4759 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390913 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390929 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390936 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390942 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390955 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390961 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390967 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390975 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.390981 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.392542 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-llpn6_b33957c4-8ef0-4b57-8e3c-183091f3b022/kube-multus/2.log"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.393104 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-llpn6_b33957c4-8ef0-4b57-8e3c-183091f3b022/kube-multus/1.log"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.393140 4759 generic.go:334] "Generic (PLEG): container finished" podID="b33957c4-8ef0-4b57-8e3c-183091f3b022" containerID="5a1fb84e174a1d7c7e9b9de4af4a4362a0de80ff5696218ca1ee9d80afd8d3a1" exitCode=2
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.393194 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-llpn6" event={"ID":"b33957c4-8ef0-4b57-8e3c-183091f3b022","Type":"ContainerDied","Data":"5a1fb84e174a1d7c7e9b9de4af4a4362a0de80ff5696218ca1ee9d80afd8d3a1"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.393217 4759 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.393534 4759 scope.go:117] "RemoveContainer" containerID="5a1fb84e174a1d7c7e9b9de4af4a4362a0de80ff5696218ca1ee9d80afd8d3a1"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.396550 4759 generic.go:334] "Generic (PLEG): container finished" podID="927e0a92-b426-4d74-bc7b-56a4baca4b61" containerID="f14598e8315d86d447a0cd7a23be1a5b13ba5d85b6fc9f1862d57ddf8e34867b" exitCode=0
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.396583 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" event={"ID":"927e0a92-b426-4d74-bc7b-56a4baca4b61","Type":"ContainerDied","Data":"f14598e8315d86d447a0cd7a23be1a5b13ba5d85b6fc9f1862d57ddf8e34867b"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.396602 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" event={"ID":"927e0a92-b426-4d74-bc7b-56a4baca4b61","Type":"ContainerStarted","Data":"322b6b23cd14e0d06f6f30bb1f0dcd2a6d6fd05731a31759fe9f4ef218fb6b6f"}
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.432764 4759 scope.go:117] "RemoveContainer" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.489380 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mbhwx"]
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.489417 4759 scope.go:117] "RemoveContainer" containerID="7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.491137 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mbhwx"]
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.511773 4759 scope.go:117] "RemoveContainer" containerID="56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.533680 4759 scope.go:117] "RemoveContainer" containerID="b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.553909 4759 scope.go:117] "RemoveContainer" containerID="52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.580686 4759 scope.go:117] "RemoveContainer" containerID="efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.611876 4759 scope.go:117] "RemoveContainer" containerID="4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.667886 4759 scope.go:117] "RemoveContainer" containerID="b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.695738 4759 scope.go:117] "RemoveContainer" containerID="45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.716050 4759 scope.go:117] "RemoveContainer" containerID="5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"
Dec 05 00:35:11 crc kubenswrapper[4759]: E1205 00:35:11.716538 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527\": container with ID starting with 5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527 not found: ID does not exist" containerID="5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.716602 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"} err="failed to get container status \"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527\": rpc error: code = NotFound desc = could not find container \"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527\": container with ID starting with 5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.716647 4759 scope.go:117] "RemoveContainer" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"
Dec 05 00:35:11 crc kubenswrapper[4759]: E1205 00:35:11.716991 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\": container with ID starting with 9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531 not found: ID does not exist" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.717026 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"} err="failed to get container status \"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\": rpc error: code = NotFound desc = could not find container \"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\": container with ID starting with 9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.717053 4759 scope.go:117] "RemoveContainer" containerID="7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"
Dec 05 00:35:11 crc kubenswrapper[4759]: E1205 00:35:11.717271 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\": container with ID starting with 7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0 not found: ID does not exist" containerID="7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.717297 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"} err="failed to get container status \"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\": rpc error: code = NotFound desc = could not find container \"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\": container with ID starting with 7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.717323 4759 scope.go:117] "RemoveContainer" containerID="56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"
Dec 05 00:35:11 crc kubenswrapper[4759]: E1205 00:35:11.717522 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\": container with ID starting with 56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9 not found: ID does not exist" containerID="56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.717544 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"} err="failed to get container status \"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\": rpc error: code = NotFound desc = could not find container \"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\": container with ID starting with 56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.717558 4759 scope.go:117] "RemoveContainer" containerID="b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"
Dec 05 00:35:11 crc kubenswrapper[4759]: E1205 00:35:11.718288 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\": container with ID starting with b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce not found: ID does not exist" containerID="b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.718331 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"} err="failed to get container status \"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\": rpc error: code = NotFound desc = could not find container \"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\": container with ID starting with b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.718347 4759 scope.go:117] "RemoveContainer" containerID="52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"
Dec 05 00:35:11 crc kubenswrapper[4759]: E1205 00:35:11.718595 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\": container with ID starting with 52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129 not found: ID does not exist" containerID="52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.718620 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"} err="failed to get container status \"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\": rpc error: code = NotFound desc = could not find container \"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\": container with ID starting with 52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.718639 4759 scope.go:117] "RemoveContainer" containerID="efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"
Dec 05 00:35:11 crc kubenswrapper[4759]: E1205 00:35:11.718902 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\": container with ID starting with efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48 not found: ID does not exist" containerID="efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.718928 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"} err="failed to get container status \"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\": rpc error: code = NotFound desc = could not find container \"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\": container with ID starting with efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.718964 4759 scope.go:117] "RemoveContainer" containerID="4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"
Dec 05 00:35:11 crc kubenswrapper[4759]: E1205 00:35:11.719188 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\": container with ID starting with 4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c not found: ID does not exist" containerID="4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.719209 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"} err="failed to get container status \"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\": rpc error: code = NotFound desc = could not find container \"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\": container with ID starting with 4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.719223 4759 scope.go:117] "RemoveContainer" containerID="b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"
Dec 05 00:35:11 crc kubenswrapper[4759]: E1205 00:35:11.719485 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\": container with ID starting with b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a not found: ID does not exist" containerID="b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.719513 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"} err="failed to get container status \"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\": rpc error: code = NotFound desc = could not find container \"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\": container with ID starting with b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.719531 4759 scope.go:117] "RemoveContainer" containerID="45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"
Dec 05 00:35:11 crc kubenswrapper[4759]: E1205 00:35:11.722281 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\": container with ID starting with 45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e not found: ID does not exist" containerID="45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.722328 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"} err="failed to get container status \"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\": rpc error: code = NotFound desc = could not find container \"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\": container with ID starting with 45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.722345 4759 scope.go:117] "RemoveContainer" containerID="5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.722612 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"} err="failed to get container status \"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527\": rpc error: code = NotFound desc = could not find container \"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527\": container with ID starting with 5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.722633 4759 scope.go:117] "RemoveContainer" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.722868 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"} err="failed to get container status \"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\": rpc error: code = NotFound desc = could not find container \"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\": container with ID starting with 9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.722897 4759 scope.go:117] "RemoveContainer" containerID="7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.723255 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"} err="failed to get container status \"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\": rpc error: code = NotFound desc = could not find container \"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\": container with ID starting with 7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.723280 4759 scope.go:117] "RemoveContainer" containerID="56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.723543 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"} err="failed to get container status \"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\": rpc error: code = NotFound desc = could not find container \"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\": container with ID starting with 56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.723569 4759 scope.go:117] "RemoveContainer" containerID="b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.723750 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"} err="failed to get container status \"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\": rpc error: code = NotFound desc = could not find container \"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\": container with ID starting with b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.723771 4759 scope.go:117] "RemoveContainer" containerID="52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.723949 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"} err="failed to get container status \"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\": rpc error: code = NotFound desc = could not find container \"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\": container with ID starting with 52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.723968 4759 scope.go:117] "RemoveContainer" containerID="efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.724137 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"} err="failed to get container status \"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\": rpc error: code = NotFound desc = could not find container \"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\": container with ID starting with efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.724155 4759 scope.go:117] "RemoveContainer" containerID="4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.724345 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"} err="failed to get container status \"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\": rpc error: code = NotFound desc = could not find container \"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\": container with ID starting with 4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.724381 4759 scope.go:117] "RemoveContainer" containerID="b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.724582 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"} err="failed to get container status \"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\": rpc error: code = NotFound desc = could not find container \"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\": container with ID starting with b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.724603 4759 scope.go:117] "RemoveContainer" containerID="45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.724773 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"} err="failed to get container status \"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\": rpc error: code = NotFound desc = could not find container \"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\": container with ID starting with 45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.724792 4759 scope.go:117] "RemoveContainer" containerID="5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.724972 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"} err="failed to get container status \"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527\": rpc error: code = NotFound desc = could not find container \"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527\": container with ID starting with 5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.724997 4759 scope.go:117] "RemoveContainer" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.725210 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"} err="failed to get container status \"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\": rpc error: code = NotFound desc = could not find container \"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\": container with ID starting with 9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.725229 4759 scope.go:117] "RemoveContainer" containerID="7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.725438 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"} err="failed to get container status \"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\": rpc error: code = NotFound desc = could not find container \"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\": container with ID starting with 7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.725453 4759 scope.go:117] "RemoveContainer" containerID="56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.725620 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"} err="failed to get container status \"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\": rpc error: code = NotFound desc = could not find container \"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\": container with ID starting with 56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.725633 4759 scope.go:117] "RemoveContainer" containerID="b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.725954 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"} err="failed to get container status \"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\": rpc error: code = NotFound desc = could not find container \"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\": container with ID starting with b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.725974 4759 scope.go:117] "RemoveContainer" containerID="52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.726194 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"} err="failed to get container status \"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\": rpc error: code = NotFound desc = could not find container \"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\": container with ID starting with 52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.726212 4759 scope.go:117] "RemoveContainer" containerID="efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.726443 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"} err="failed to get container status \"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\": rpc error: code = NotFound desc = could not find container \"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\": container with ID starting with efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.726457 4759 scope.go:117] "RemoveContainer" containerID="4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.726639 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"} err="failed to get container status \"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\": rpc error: code = NotFound desc = could not find container \"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\": container with ID starting with 4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.726661 4759 scope.go:117] "RemoveContainer" containerID="b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.726860 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"} err="failed to get container status \"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\": rpc error: code = NotFound desc = could not find container \"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\": container with ID starting with b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.726878 4759 scope.go:117] "RemoveContainer" containerID="45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.727077 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"} err="failed to get container status \"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\": rpc error: code = NotFound desc = could not find container \"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\": container with ID starting with 45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.727092 4759 scope.go:117] "RemoveContainer" containerID="5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.727270 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"} err="failed to get container status \"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527\": rpc error: code = NotFound desc = could not find container \"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527\": container with ID starting with 5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.727284 4759 scope.go:117] "RemoveContainer" containerID="9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.727914 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531"} err="failed to get container status \"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\": rpc error: code = NotFound desc = could not find container \"9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531\": container with ID starting with 9eaa46961c7892b51ca2ffe1dc9693968f8adfaf8cc578f2a164dddb8cd2c531 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.727941 4759 scope.go:117] "RemoveContainer" containerID="7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.729609 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0"} err="failed to get container status \"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\": rpc error: code = NotFound desc = could not find container \"7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0\": container with ID starting with 7a8223544e3ac2abca7566b429d6d2a1de3694d8b87ee64dc208557b5a22f3b0 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.729634 4759 scope.go:117] "RemoveContainer" containerID="56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.729852 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9"} err="failed to get container status \"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\": rpc error: code = NotFound desc = could not find container \"56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9\": container with ID starting with 56a6f649382d2dbd27b9ab393635f6b665a8e392839cfd02cf3ac7f81e4697b9 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.729869 4759 scope.go:117] "RemoveContainer" containerID="b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730037 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce"} err="failed to get container status \"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\": rpc error: code = NotFound desc = could not find container \"b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce\": container with ID starting with b570e59173051d48ae3fa4f8e9d95de808d4a46b57d07be15a89541ca6630bce not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730069 4759 scope.go:117] "RemoveContainer" containerID="52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730205 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129"} err="failed to get container status \"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\": rpc error: code = NotFound desc = could not find container \"52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129\": container with ID starting with 52bcdcd65221a31cb8ee1d582d3360f0d44d5c0d3e93ab3f47018dd480400129 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730237 4759 scope.go:117] "RemoveContainer" containerID="efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730387 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48"} err="failed to get container status \"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\": rpc error: code = NotFound desc = could not find container \"efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48\": container with ID starting with efa3c9e217414bba67c1708e79c9dc6fae4050a01a458700d7cc7960b058be48 not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730401 4759 scope.go:117] "RemoveContainer" containerID="4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730520 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c"} err="failed to get container status \"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\": rpc error: code = NotFound desc = could not find container \"4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c\": container with ID starting with 4700d33ec748f0a28ac152694c18ef817ba82fa7a8f8a98c1d1acbb06203570c not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730552 4759 scope.go:117] "RemoveContainer" containerID="b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730669 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a"} err="failed to get container status \"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\": rpc error: code = NotFound desc = could not find container \"b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a\": container with ID starting with b20b16e22278cbc2718a9d0609572f0e500491c3158c1f64612baae6cf9e273a not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730701 4759 scope.go:117] "RemoveContainer" containerID="45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730833 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e"} err="failed to get container status \"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\": rpc error: code = NotFound desc = could not find container \"45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e\": container with ID starting with 45073065e34922983c127dc789f54d6b49d9c3191fad7aaffd0425569898495e not found: ID does not exist"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.730865 4759 scope.go:117] "RemoveContainer" containerID="5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"
Dec 05 00:35:11 crc kubenswrapper[4759]: I1205 00:35:11.731175 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"} err="failed to get container status \"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527\": rpc error: code = NotFound desc = could not find container \"5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527\": container with ID starting with 5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527 not found: ID does not exist"
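The burst above is the kubelet's container cleanup racing a runtime that has already pruned these ten containers: each RemoveContainer attempt first asks CRI-O for the container's status, gets gRPC NotFound back, and logs "DeleteContainer returned error" even though the desired end state (container gone) already holds, so the repeated rounds are noise rather than a fault. Below is a minimal sketch of that status-then-remove round trip over the CRI socket, assuming the k8s.io/cri-api client, a recent grpc-go, and the CRI-O socket path (all assumptions; illustrative, not the kubelet's own code path):

    // notfound_probe.go: sketch of the CRI round trip seen in the log above.
    package main

    import (
        "context"
        "fmt"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/credentials/insecure"
        "google.golang.org/grpc/status"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
        defer cancel()

        // Socket path assumed for CRI-O on this node.
        conn, err := grpc.NewClient("unix:///var/run/crio/crio.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            panic(err)
        }
        defer conn.Close()
        rt := runtimeapi.NewRuntimeServiceClient(conn)

        // One of the container IDs from the entries above.
        id := "5687f9c9f66e5b1d77458ccfab224226aa8b2eb8299095afd40a332e1d14f527"
        _, err = rt.ContainerStatus(ctx, &runtimeapi.ContainerStatusRequest{ContainerId: id})
        if status.Code(err) == codes.NotFound {
            // Already gone: deletion is effectively complete, which is why
            // every retry in the log terminates immediately.
            fmt.Println("already removed:", id)
            return
        }
        // Only if the container still exists does a RemoveContainer apply.
        _, err = rt.RemoveContainer(ctx, &runtimeapi.RemoveContainerRequest{ContainerId: id})
        fmt.Println("remove err:", err)
    }

Treating NotFound as success is what makes the delete idempotent; the kubelet reaches the same conclusion internally, so the storm above is harmless even though it is logged at error level for the status lookups.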
Dec 05 00:35:12 crc kubenswrapper[4759]: I1205 00:35:12.403687 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-llpn6_b33957c4-8ef0-4b57-8e3c-183091f3b022/kube-multus/2.log"
Dec 05 00:35:12 crc kubenswrapper[4759]: I1205 00:35:12.404464 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-llpn6_b33957c4-8ef0-4b57-8e3c-183091f3b022/kube-multus/1.log"
Dec 05 00:35:12 crc kubenswrapper[4759]: I1205 00:35:12.404550 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-llpn6" event={"ID":"b33957c4-8ef0-4b57-8e3c-183091f3b022","Type":"ContainerStarted","Data":"00e8e0fcbec00457c1acd0d348d30fe4cc1ab8211ab488635722434c67d2c64a"}
Dec 05 00:35:12 crc kubenswrapper[4759]: I1205 00:35:12.408569 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" event={"ID":"927e0a92-b426-4d74-bc7b-56a4baca4b61","Type":"ContainerStarted","Data":"63c8d709df30abe02f20b1e50cd51f656bbc04926114f028ea8b5524630cb611"}
Dec 05 00:35:12 crc kubenswrapper[4759]: I1205 00:35:12.408615 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" event={"ID":"927e0a92-b426-4d74-bc7b-56a4baca4b61","Type":"ContainerStarted","Data":"3d7a7b228afe52841452265e2eb226b956616eee02205059da16f370e77c8905"}
Dec 05 00:35:12 crc kubenswrapper[4759]: I1205 00:35:12.408626 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" event={"ID":"927e0a92-b426-4d74-bc7b-56a4baca4b61","Type":"ContainerStarted","Data":"8b5666fe35a0b2ab416ff0ff00a8a365301dbc7cb1d8b6fbd9c61636e4617891"}
Dec 05 00:35:12 crc kubenswrapper[4759]: I1205 00:35:12.408635 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" event={"ID":"927e0a92-b426-4d74-bc7b-56a4baca4b61","Type":"ContainerStarted","Data":"105b13b42c5b23a24dc40875710cb5531a5468e1e29a441c8d2f55feb3dc8d29"}
Dec 05 00:35:12 crc kubenswrapper[4759]: I1205 00:35:12.408644 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" event={"ID":"927e0a92-b426-4d74-bc7b-56a4baca4b61","Type":"ContainerStarted","Data":"65726a94d60fe22143b7fbf1cafaa91817f562e699a132ec4459ebc6e7cff032"}
Dec 05 00:35:12 crc kubenswrapper[4759]: I1205 00:35:12.408652 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" event={"ID":"927e0a92-b426-4d74-bc7b-56a4baca4b61","Type":"ContainerStarted","Data":"4fb348d6792e48f93131c8b8f27c196b813e80a947b22219d26b0dce32f85abf"}
Dec 05 00:35:13 crc kubenswrapper[4759]: I1205 00:35:13.163555 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45fa490b-1113-4ee6-9604-dc322ca11bd3" path="/var/lib/kubelet/pods/45fa490b-1113-4ee6-9604-dc322ca11bd3/volumes"
Dec 05 00:35:15 crc kubenswrapper[4759]: I1205 00:35:15.427954 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" event={"ID":"927e0a92-b426-4d74-bc7b-56a4baca4b61","Type":"ContainerStarted","Data":"a4d0f4c1c17fa78d435320265dd6902c007693d252b81e173dfeda25af79d44e"}
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.454279 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" event={"ID":"927e0a92-b426-4d74-bc7b-56a4baca4b61","Type":"ContainerStarted","Data":"ff1d0edc9904666fbcf4582fc6caaad13aec75f67fc39c0779240aa4cec1e082"}
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.454683 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.454702 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.454713 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.493437 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" podStartSLOduration=7.493405293 podStartE2EDuration="7.493405293s" podCreationTimestamp="2025-12-05 00:35:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:35:17.489560018 +0000 UTC m=+736.705220978" watchObservedRunningTime="2025-12-05 00:35:17.493405293 +0000 UTC m=+736.709066243"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.504507 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.505377 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.807111 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd"]
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.807755 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd"
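The pod_startup_latency_tracker entry is plain clock arithmetic: podCreationTimestamp is 00:35:10 and watchObservedRunningTime is 00:35:17.493405293, so podStartSLOduration=7.493405293 seconds; the m=+736.7... suffixes are the kubelet's monotonic-clock offsets since process start, not separate wall-clock data. A small sketch reproducing the subtraction, with the timestamps copied from the entry above:

    // sloduration.go: reproduces the podStartSLOduration arithmetic.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, _ := time.Parse(layout, "2025-12-05 00:35:10 +0000 UTC")
        observed, _ := time.Parse(layout, "2025-12-05 00:35:17.493405293 +0000 UTC")
        // Prints 7.493405293s, matching podStartSLOduration in the log.
        fmt.Println(observed.Sub(created))
    }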
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.810524 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.810592 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.810604 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-hjgcq"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.885398 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbzlv\" (UniqueName: \"kubernetes.io/projected/b21d3e23-0940-4825-801e-ae74255085bd-kube-api-access-gbzlv\") pod \"obo-prometheus-operator-668cf9dfbb-sc8sd\" (UID: \"b21d3e23-0940-4825-801e-ae74255085bd\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.893844 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd"]
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.950127 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"]
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.951867 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.957134 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-6cwd2"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.957415 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.967585 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"]
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.974735 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c"]
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.976144 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.990951 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94fbcc74-1faa-44a4-8ea9-36028cc96003-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb\" (UID: \"94fbcc74-1faa-44a4-8ea9-36028cc96003\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.995879 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbzlv\" (UniqueName: \"kubernetes.io/projected/b21d3e23-0940-4825-801e-ae74255085bd-kube-api-access-gbzlv\") pod \"obo-prometheus-operator-668cf9dfbb-sc8sd\" (UID: \"b21d3e23-0940-4825-801e-ae74255085bd\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd"
Dec 05 00:35:17 crc kubenswrapper[4759]: I1205 00:35:17.996052 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94fbcc74-1faa-44a4-8ea9-36028cc96003-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb\" (UID: \"94fbcc74-1faa-44a4-8ea9-36028cc96003\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.012532 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c"]
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.042347 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbzlv\" (UniqueName: \"kubernetes.io/projected/b21d3e23-0940-4825-801e-ae74255085bd-kube-api-access-gbzlv\") pod \"obo-prometheus-operator-668cf9dfbb-sc8sd\" (UID: \"b21d3e23-0940-4825-801e-ae74255085bd\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.072578 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-6pmzz"]
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.073451 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz"
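The reconciler_common/operation_generator lines trace the kubelet volume manager's two-phase pattern, visible end to end above for kube-api-access-gbzlv: VerifyControllerAttachedVolume (reconciler_common.go:245) first confirms the volume is recorded as attached for the pod, then MountVolume starts (reconciler_common.go:218) and operation_generator.go:637 reports MountVolume.SetUp succeeded once the secret/projected contents exist on disk; only after that can the pod's containers be created. A toy sketch of that ordering follows; verifyAttached and setUp are hypothetical stand-ins for the operation executor, not kubelet APIs:

    // volume_phases.go: toy model of the attach-then-mount ordering above.
    package main

    import "fmt"

    type volume struct{ uniqueName, pod string }

    // Phase 1 (hypothetical): confirm the volume is marked attached.
    func verifyAttached(v volume) error {
        fmt.Println("VerifyControllerAttachedVolume started for", v.uniqueName)
        return nil
    }

    // Phase 2 (hypothetical): mount/project the contents into the pod dir.
    func setUp(v volume) error {
        fmt.Println("MountVolume.SetUp succeeded for", v.uniqueName)
        return nil
    }

    func main() {
        v := volume{
            uniqueName: "kubernetes.io/projected/b21d3e23-0940-4825-801e-ae74255085bd-kube-api-access-gbzlv",
            pod:        "openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd",
        }
        if err := verifyAttached(v); err != nil {
            return // never mount before attachment is confirmed
        }
        _ = setUp(v) // only now may container creation proceed
    }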
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.077982 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-cgcdk"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.078874 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.095212 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-6pmzz"]
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.097814 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d87236a6-f3c6-470f-a197-05846a9b0c22-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-6pmzz\" (UID: \"d87236a6-f3c6-470f-a197-05846a9b0c22\") " pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.097914 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38b9b1d9-e67c-4aad-a22a-496d348f5148-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c\" (UID: \"38b9b1d9-e67c-4aad-a22a-496d348f5148\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.098029 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38b9b1d9-e67c-4aad-a22a-496d348f5148-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c\" (UID: \"38b9b1d9-e67c-4aad-a22a-496d348f5148\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.098086 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94fbcc74-1faa-44a4-8ea9-36028cc96003-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb\" (UID: \"94fbcc74-1faa-44a4-8ea9-36028cc96003\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.098118 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94fbcc74-1faa-44a4-8ea9-36028cc96003-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb\" (UID: \"94fbcc74-1faa-44a4-8ea9-36028cc96003\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.098217 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npdsh\" (UniqueName: \"kubernetes.io/projected/d87236a6-f3c6-470f-a197-05846a9b0c22-kube-api-access-npdsh\") pod \"observability-operator-d8bb48f5d-6pmzz\" (UID: \"d87236a6-f3c6-470f-a197-05846a9b0c22\") " pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.102715 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94fbcc74-1faa-44a4-8ea9-36028cc96003-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb\" (UID: \"94fbcc74-1faa-44a4-8ea9-36028cc96003\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.105786 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94fbcc74-1faa-44a4-8ea9-36028cc96003-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb\" (UID: \"94fbcc74-1faa-44a4-8ea9-36028cc96003\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.130682 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd"
Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.154417 4759 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators_b21d3e23-0940-4825-801e-ae74255085bd_0(6c23dc329bfcd48d65d20218108b6dd742af09e4b47935667d3e5a7c7a5914f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.154507 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators_b21d3e23-0940-4825-801e-ae74255085bd_0(6c23dc329bfcd48d65d20218108b6dd742af09e4b47935667d3e5a7c7a5914f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd"
Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.154534 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators_b21d3e23-0940-4825-801e-ae74255085bd_0(6c23dc329bfcd48d65d20218108b6dd742af09e4b47935667d3e5a7c7a5914f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd"
Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.154609 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators(b21d3e23-0940-4825-801e-ae74255085bd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators(b21d3e23-0940-4825-801e-ae74255085bd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators_b21d3e23-0940-4825-801e-ae74255085bd_0(6c23dc329bfcd48d65d20218108b6dd742af09e4b47935667d3e5a7c7a5914f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd" podUID="b21d3e23-0940-4825-801e-ae74255085bd"
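Every sandbox failure in this section carries the same root cause string: no CNI configuration file in /etc/kubernetes/cni/net.d/. CRI-O cannot wire a pod network until a network provider writes a config into that directory, so RunPodSandbox fails and the pod worker records a CreatePodSandboxError. Below is a quick diagnostic sketch performing the same existence check, assuming the directory path taken from the error text and the standard CNI config extensions (.conf, .conflist, .json):

    // cnicheck.go: looks for CNI network configs the way the error implies.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // path taken from the sandbox error
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read CNI conf dir:", err)
            return
        }
        found := false
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json": // extensions libcni loads
                fmt.Println("CNI config present:", e.Name())
                found = true
            }
        }
        if !found {
            fmt.Println("no CNI configuration file; sandbox creation will fail")
        }
    }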
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.199741 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d87236a6-f3c6-470f-a197-05846a9b0c22-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-6pmzz\" (UID: \"d87236a6-f3c6-470f-a197-05846a9b0c22\") " pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.200047 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38b9b1d9-e67c-4aad-a22a-496d348f5148-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c\" (UID: \"38b9b1d9-e67c-4aad-a22a-496d348f5148\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.200125 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38b9b1d9-e67c-4aad-a22a-496d348f5148-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c\" (UID: \"38b9b1d9-e67c-4aad-a22a-496d348f5148\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.200344 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npdsh\" (UniqueName: \"kubernetes.io/projected/d87236a6-f3c6-470f-a197-05846a9b0c22-kube-api-access-npdsh\") pod \"observability-operator-d8bb48f5d-6pmzz\" (UID: \"d87236a6-f3c6-470f-a197-05846a9b0c22\") " pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.206727 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38b9b1d9-e67c-4aad-a22a-496d348f5148-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c\" (UID: \"38b9b1d9-e67c-4aad-a22a-496d348f5148\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.208241 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38b9b1d9-e67c-4aad-a22a-496d348f5148-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c\" (UID: \"38b9b1d9-e67c-4aad-a22a-496d348f5148\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.208886 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d87236a6-f3c6-470f-a197-05846a9b0c22-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-6pmzz\" (UID: \"d87236a6-f3c6-470f-a197-05846a9b0c22\") " pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.235338 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npdsh\" (UniqueName: \"kubernetes.io/projected/d87236a6-f3c6-470f-a197-05846a9b0c22-kube-api-access-npdsh\") pod \"observability-operator-d8bb48f5d-6pmzz\" (UID: \"d87236a6-f3c6-470f-a197-05846a9b0c22\") " pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.271087 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.291847 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-qp8vs"]
Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.292596 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qp8vs"
Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.294103 4759 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators_94fbcc74-1faa-44a4-8ea9-36028cc96003_0(2c13ba1314d4595f6343d3593766f7f5e6467320d43b370ee6615b083177af33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.294157 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators_94fbcc74-1faa-44a4-8ea9-36028cc96003_0(2c13ba1314d4595f6343d3593766f7f5e6467320d43b370ee6615b083177af33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"
Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.294176 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators_94fbcc74-1faa-44a4-8ea9-36028cc96003_0(2c13ba1314d4595f6343d3593766f7f5e6467320d43b370ee6615b083177af33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"
Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.294216 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators(94fbcc74-1faa-44a4-8ea9-36028cc96003)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators(94fbcc74-1faa-44a4-8ea9-36028cc96003)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators_94fbcc74-1faa-44a4-8ea9-36028cc96003_0(2c13ba1314d4595f6343d3593766f7f5e6467320d43b370ee6615b083177af33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" podUID="94fbcc74-1faa-44a4-8ea9-36028cc96003"
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" podUID="94fbcc74-1faa-44a4-8ea9-36028cc96003" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.294531 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-p4ftn" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.301485 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/96032405-2b01-4177-895c-f26ca2d838a9-openshift-service-ca\") pod \"perses-operator-5446b9c989-qp8vs\" (UID: \"96032405-2b01-4177-895c-f26ca2d838a9\") " pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.301538 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2xnh\" (UniqueName: \"kubernetes.io/projected/96032405-2b01-4177-895c-f26ca2d838a9-kube-api-access-g2xnh\") pod \"perses-operator-5446b9c989-qp8vs\" (UID: \"96032405-2b01-4177-895c-f26ca2d838a9\") " pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.319629 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-qp8vs"] Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.325077 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.345057 4759 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators_38b9b1d9-e67c-4aad-a22a-496d348f5148_0(4a846a6cff810a95e8f06dd996b1d2f9f99ab569b3e761ddb9f600f8b0b5f2ab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.345152 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators_38b9b1d9-e67c-4aad-a22a-496d348f5148_0(4a846a6cff810a95e8f06dd996b1d2f9f99ab569b3e761ddb9f600f8b0b5f2ab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.345182 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators_38b9b1d9-e67c-4aad-a22a-496d348f5148_0(4a846a6cff810a95e8f06dd996b1d2f9f99ab569b3e761ddb9f600f8b0b5f2ab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.345247 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators(38b9b1d9-e67c-4aad-a22a-496d348f5148)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators(38b9b1d9-e67c-4aad-a22a-496d348f5148)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators_38b9b1d9-e67c-4aad-a22a-496d348f5148_0(4a846a6cff810a95e8f06dd996b1d2f9f99ab569b3e761ddb9f600f8b0b5f2ab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" podUID="38b9b1d9-e67c-4aad-a22a-496d348f5148" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.394574 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.403954 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/96032405-2b01-4177-895c-f26ca2d838a9-openshift-service-ca\") pod \"perses-operator-5446b9c989-qp8vs\" (UID: \"96032405-2b01-4177-895c-f26ca2d838a9\") " pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.404029 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2xnh\" (UniqueName: \"kubernetes.io/projected/96032405-2b01-4177-895c-f26ca2d838a9-kube-api-access-g2xnh\") pod \"perses-operator-5446b9c989-qp8vs\" (UID: \"96032405-2b01-4177-895c-f26ca2d838a9\") " pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.405090 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/96032405-2b01-4177-895c-f26ca2d838a9-openshift-service-ca\") pod \"perses-operator-5446b9c989-qp8vs\" (UID: \"96032405-2b01-4177-895c-f26ca2d838a9\") " pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.417670 4759 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-6pmzz_openshift-operators_d87236a6-f3c6-470f-a197-05846a9b0c22_0(c2ce97ad75246c838d91eb23c74bc0130e35258806f8078ba36975ef9796aa9e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.417772 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-6pmzz_openshift-operators_d87236a6-f3c6-470f-a197-05846a9b0c22_0(c2ce97ad75246c838d91eb23c74bc0130e35258806f8078ba36975ef9796aa9e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.417805 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-6pmzz_openshift-operators_d87236a6-f3c6-470f-a197-05846a9b0c22_0(c2ce97ad75246c838d91eb23c74bc0130e35258806f8078ba36975ef9796aa9e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.417905 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-6pmzz_openshift-operators(d87236a6-f3c6-470f-a197-05846a9b0c22)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-6pmzz_openshift-operators(d87236a6-f3c6-470f-a197-05846a9b0c22)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-6pmzz_openshift-operators_d87236a6-f3c6-470f-a197-05846a9b0c22_0(c2ce97ad75246c838d91eb23c74bc0130e35258806f8078ba36975ef9796aa9e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" podUID="d87236a6-f3c6-470f-a197-05846a9b0c22" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.434103 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2xnh\" (UniqueName: \"kubernetes.io/projected/96032405-2b01-4177-895c-f26ca2d838a9-kube-api-access-g2xnh\") pod \"perses-operator-5446b9c989-qp8vs\" (UID: \"96032405-2b01-4177-895c-f26ca2d838a9\") " pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.459962 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.460741 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.461191 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.461440 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.461612 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.461857 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.462751 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.462947 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.529708 4759 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators_94fbcc74-1faa-44a4-8ea9-36028cc96003_0(04e4bb7409b11fb44a6b66fc8fbbafe12e02f33be42e84db0cefacf5903b0e91): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.529878 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators_94fbcc74-1faa-44a4-8ea9-36028cc96003_0(04e4bb7409b11fb44a6b66fc8fbbafe12e02f33be42e84db0cefacf5903b0e91): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.529925 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators_94fbcc74-1faa-44a4-8ea9-36028cc96003_0(04e4bb7409b11fb44a6b66fc8fbbafe12e02f33be42e84db0cefacf5903b0e91): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.530018 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators(94fbcc74-1faa-44a4-8ea9-36028cc96003)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators(94fbcc74-1faa-44a4-8ea9-36028cc96003)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_openshift-operators_94fbcc74-1faa-44a4-8ea9-36028cc96003_0(04e4bb7409b11fb44a6b66fc8fbbafe12e02f33be42e84db0cefacf5903b0e91): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" podUID="94fbcc74-1faa-44a4-8ea9-36028cc96003" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.536734 4759 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators_b21d3e23-0940-4825-801e-ae74255085bd_0(6481899b131b442c877764b88dfdc7b74a06a3038daa523f4636a6d3329b45bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.536829 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators_b21d3e23-0940-4825-801e-ae74255085bd_0(6481899b131b442c877764b88dfdc7b74a06a3038daa523f4636a6d3329b45bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.536873 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators_b21d3e23-0940-4825-801e-ae74255085bd_0(6481899b131b442c877764b88dfdc7b74a06a3038daa523f4636a6d3329b45bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.536934 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators(b21d3e23-0940-4825-801e-ae74255085bd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators(b21d3e23-0940-4825-801e-ae74255085bd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-sc8sd_openshift-operators_b21d3e23-0940-4825-801e-ae74255085bd_0(6481899b131b442c877764b88dfdc7b74a06a3038daa523f4636a6d3329b45bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd" podUID="b21d3e23-0940-4825-801e-ae74255085bd" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.543540 4759 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators_38b9b1d9-e67c-4aad-a22a-496d348f5148_0(4d297528b5d50b36b26b9e06a0c5967b6df0412212260a7788a193213a354e6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.543594 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators_38b9b1d9-e67c-4aad-a22a-496d348f5148_0(4d297528b5d50b36b26b9e06a0c5967b6df0412212260a7788a193213a354e6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.543619 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators_38b9b1d9-e67c-4aad-a22a-496d348f5148_0(4d297528b5d50b36b26b9e06a0c5967b6df0412212260a7788a193213a354e6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.543672 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators(38b9b1d9-e67c-4aad-a22a-496d348f5148)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators(38b9b1d9-e67c-4aad-a22a-496d348f5148)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_openshift-operators_38b9b1d9-e67c-4aad-a22a-496d348f5148_0(4d297528b5d50b36b26b9e06a0c5967b6df0412212260a7788a193213a354e6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" podUID="38b9b1d9-e67c-4aad-a22a-496d348f5148" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.560242 4759 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-6pmzz_openshift-operators_d87236a6-f3c6-470f-a197-05846a9b0c22_0(8f63b3f8c3ddcc965817e6ae270bdb3f69c09fd723509afd29579432d73d98bd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.560353 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-6pmzz_openshift-operators_d87236a6-f3c6-470f-a197-05846a9b0c22_0(8f63b3f8c3ddcc965817e6ae270bdb3f69c09fd723509afd29579432d73d98bd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.560386 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-6pmzz_openshift-operators_d87236a6-f3c6-470f-a197-05846a9b0c22_0(8f63b3f8c3ddcc965817e6ae270bdb3f69c09fd723509afd29579432d73d98bd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.560441 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-6pmzz_openshift-operators(d87236a6-f3c6-470f-a197-05846a9b0c22)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-6pmzz_openshift-operators(d87236a6-f3c6-470f-a197-05846a9b0c22)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-6pmzz_openshift-operators_d87236a6-f3c6-470f-a197-05846a9b0c22_0(8f63b3f8c3ddcc965817e6ae270bdb3f69c09fd723509afd29579432d73d98bd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" podUID="d87236a6-f3c6-470f-a197-05846a9b0c22" Dec 05 00:35:18 crc kubenswrapper[4759]: I1205 00:35:18.610135 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.630922 4759 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qp8vs_openshift-operators_96032405-2b01-4177-895c-f26ca2d838a9_0(e9716b157a9610bb7619736bd010d699c850c15d4356b0bf30f1d9ffcfee92b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.631052 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qp8vs_openshift-operators_96032405-2b01-4177-895c-f26ca2d838a9_0(e9716b157a9610bb7619736bd010d699c850c15d4356b0bf30f1d9ffcfee92b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.631088 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qp8vs_openshift-operators_96032405-2b01-4177-895c-f26ca2d838a9_0(e9716b157a9610bb7619736bd010d699c850c15d4356b0bf30f1d9ffcfee92b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:18 crc kubenswrapper[4759]: E1205 00:35:18.631171 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-qp8vs_openshift-operators(96032405-2b01-4177-895c-f26ca2d838a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-qp8vs_openshift-operators(96032405-2b01-4177-895c-f26ca2d838a9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qp8vs_openshift-operators_96032405-2b01-4177-895c-f26ca2d838a9_0(e9716b157a9610bb7619736bd010d699c850c15d4356b0bf30f1d9ffcfee92b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" podUID="96032405-2b01-4177-895c-f26ca2d838a9" Dec 05 00:35:19 crc kubenswrapper[4759]: I1205 00:35:19.465039 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:19 crc kubenswrapper[4759]: I1205 00:35:19.466168 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:19 crc kubenswrapper[4759]: E1205 00:35:19.505336 4759 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qp8vs_openshift-operators_96032405-2b01-4177-895c-f26ca2d838a9_0(3d116e38ce9a8422034b4669bd8556754d09cd8f24750a4b9d48e204578c211e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 05 00:35:19 crc kubenswrapper[4759]: E1205 00:35:19.505461 4759 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qp8vs_openshift-operators_96032405-2b01-4177-895c-f26ca2d838a9_0(3d116e38ce9a8422034b4669bd8556754d09cd8f24750a4b9d48e204578c211e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:19 crc kubenswrapper[4759]: E1205 00:35:19.505507 4759 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qp8vs_openshift-operators_96032405-2b01-4177-895c-f26ca2d838a9_0(3d116e38ce9a8422034b4669bd8556754d09cd8f24750a4b9d48e204578c211e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:19 crc kubenswrapper[4759]: E1205 00:35:19.505588 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-qp8vs_openshift-operators(96032405-2b01-4177-895c-f26ca2d838a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-qp8vs_openshift-operators(96032405-2b01-4177-895c-f26ca2d838a9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qp8vs_openshift-operators_96032405-2b01-4177-895c-f26ca2d838a9_0(3d116e38ce9a8422034b4669bd8556754d09cd8f24750a4b9d48e204578c211e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" podUID="96032405-2b01-4177-895c-f26ca2d838a9" Dec 05 00:35:29 crc kubenswrapper[4759]: I1205 00:35:29.155686 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" Dec 05 00:35:29 crc kubenswrapper[4759]: I1205 00:35:29.156724 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" Dec 05 00:35:29 crc kubenswrapper[4759]: I1205 00:35:29.577961 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c"] Dec 05 00:35:29 crc kubenswrapper[4759]: W1205 00:35:29.584507 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38b9b1d9_e67c_4aad_a22a_496d348f5148.slice/crio-a68331869aefe352b7e830ccaa3214e6d9f0e77b9b5f59acae9e13d59bada709 WatchSource:0}: Error finding container a68331869aefe352b7e830ccaa3214e6d9f0e77b9b5f59acae9e13d59bada709: Status 404 returned error can't find the container with id a68331869aefe352b7e830ccaa3214e6d9f0e77b9b5f59acae9e13d59bada709 Dec 05 00:35:30 crc kubenswrapper[4759]: I1205 00:35:30.155232 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd" Dec 05 00:35:30 crc kubenswrapper[4759]: I1205 00:35:30.155748 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd" Dec 05 00:35:30 crc kubenswrapper[4759]: I1205 00:35:30.342629 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd"] Dec 05 00:35:30 crc kubenswrapper[4759]: W1205 00:35:30.354436 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb21d3e23_0940_4825_801e_ae74255085bd.slice/crio-63cc0387985bf709a2f666720f447bcf63366213e2e18f553f66b2de80951080 WatchSource:0}: Error finding container 63cc0387985bf709a2f666720f447bcf63366213e2e18f553f66b2de80951080: Status 404 returned error can't find the container with id 63cc0387985bf709a2f666720f447bcf63366213e2e18f553f66b2de80951080 Dec 05 00:35:30 crc kubenswrapper[4759]: I1205 00:35:30.518616 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd" event={"ID":"b21d3e23-0940-4825-801e-ae74255085bd","Type":"ContainerStarted","Data":"63cc0387985bf709a2f666720f447bcf63366213e2e18f553f66b2de80951080"} Dec 05 00:35:30 crc kubenswrapper[4759]: I1205 00:35:30.520103 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" event={"ID":"38b9b1d9-e67c-4aad-a22a-496d348f5148","Type":"ContainerStarted","Data":"a68331869aefe352b7e830ccaa3214e6d9f0e77b9b5f59acae9e13d59bada709"} Dec 05 00:35:33 crc kubenswrapper[4759]: I1205 00:35:33.155027 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" Dec 05 00:35:33 crc kubenswrapper[4759]: I1205 00:35:33.155070 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" Dec 05 00:35:33 crc kubenswrapper[4759]: I1205 00:35:33.156140 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" Dec 05 00:35:33 crc kubenswrapper[4759]: I1205 00:35:33.156392 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" Dec 05 00:35:33 crc kubenswrapper[4759]: I1205 00:35:33.155202 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:33 crc kubenswrapper[4759]: I1205 00:35:33.156780 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:34 crc kubenswrapper[4759]: I1205 00:35:34.433770 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:35:34 crc kubenswrapper[4759]: I1205 00:35:34.433838 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:35:38 crc kubenswrapper[4759]: I1205 00:35:38.486849 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-6pmzz"] Dec 05 00:35:38 crc kubenswrapper[4759]: I1205 00:35:38.522781 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb"] Dec 05 00:35:38 crc kubenswrapper[4759]: W1205 00:35:38.526904 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94fbcc74_1faa_44a4_8ea9_36028cc96003.slice/crio-a873d1085e54a30e8637bd5f5db0e94f578bca0b53ffe893d4dab736d8919229 WatchSource:0}: Error finding container a873d1085e54a30e8637bd5f5db0e94f578bca0b53ffe893d4dab736d8919229: Status 404 returned error can't find the container with id a873d1085e54a30e8637bd5f5db0e94f578bca0b53ffe893d4dab736d8919229 Dec 05 00:35:38 crc kubenswrapper[4759]: I1205 00:35:38.577555 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-qp8vs"] Dec 05 00:35:38 crc kubenswrapper[4759]: I1205 00:35:38.610081 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd" event={"ID":"b21d3e23-0940-4825-801e-ae74255085bd","Type":"ContainerStarted","Data":"887c4775510e3a880851d275ef863c013daf7a4c0552e450ece35d24ebcf8327"} Dec 05 00:35:38 crc kubenswrapper[4759]: I1205 00:35:38.649836 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sc8sd" podStartSLOduration=13.725793899 podStartE2EDuration="21.649813945s" podCreationTimestamp="2025-12-05 00:35:17 +0000 UTC" firstStartedPulling="2025-12-05 00:35:30.355982501 +0000 UTC m=+749.571643451" lastFinishedPulling="2025-12-05 00:35:38.280002547 +0000 UTC m=+757.495663497" observedRunningTime="2025-12-05 00:35:38.635850869 +0000 UTC m=+757.851511829" watchObservedRunningTime="2025-12-05 00:35:38.649813945 +0000 UTC m=+757.865474885" Dec 05 00:35:38 crc kubenswrapper[4759]: I1205 00:35:38.668163 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" event={"ID":"38b9b1d9-e67c-4aad-a22a-496d348f5148","Type":"ContainerStarted","Data":"dc6e14d10bbee4abdc0e2c0890cdfb63456f31792993c618f7b51f0895af1313"} Dec 05 00:35:38 crc kubenswrapper[4759]: I1205 00:35:38.677587 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" 
event={"ID":"d87236a6-f3c6-470f-a197-05846a9b0c22","Type":"ContainerStarted","Data":"1347356ce64f3b1d287357c554c4f6657a933bd19591e79c8a8dc3ea368c8574"} Dec 05 00:35:38 crc kubenswrapper[4759]: I1205 00:35:38.687441 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" event={"ID":"94fbcc74-1faa-44a4-8ea9-36028cc96003","Type":"ContainerStarted","Data":"a873d1085e54a30e8637bd5f5db0e94f578bca0b53ffe893d4dab736d8919229"} Dec 05 00:35:38 crc kubenswrapper[4759]: I1205 00:35:38.694113 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c" podStartSLOduration=13.017917788 podStartE2EDuration="21.694099362s" podCreationTimestamp="2025-12-05 00:35:17 +0000 UTC" firstStartedPulling="2025-12-05 00:35:29.586012783 +0000 UTC m=+748.801673733" lastFinishedPulling="2025-12-05 00:35:38.262194357 +0000 UTC m=+757.477855307" observedRunningTime="2025-12-05 00:35:38.691635161 +0000 UTC m=+757.907296111" watchObservedRunningTime="2025-12-05 00:35:38.694099362 +0000 UTC m=+757.909760312" Dec 05 00:35:39 crc kubenswrapper[4759]: I1205 00:35:39.693610 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" event={"ID":"96032405-2b01-4177-895c-f26ca2d838a9","Type":"ContainerStarted","Data":"5f7a2c1286213aa889397635e968fa23dc532f308aadc6ffdb37d51c8d38301d"} Dec 05 00:35:39 crc kubenswrapper[4759]: I1205 00:35:39.696491 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" event={"ID":"94fbcc74-1faa-44a4-8ea9-36028cc96003","Type":"ContainerStarted","Data":"26efb43395bd43e219574624f183063d30774e4fc3f477f2d550b449351bb403"} Dec 05 00:35:39 crc kubenswrapper[4759]: I1205 00:35:39.723936 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb" podStartSLOduration=22.723913136 podStartE2EDuration="22.723913136s" podCreationTimestamp="2025-12-05 00:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:35:39.719179738 +0000 UTC m=+758.934840728" watchObservedRunningTime="2025-12-05 00:35:39.723913136 +0000 UTC m=+758.939574096" Dec 05 00:35:41 crc kubenswrapper[4759]: I1205 00:35:41.260007 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wlm8s" Dec 05 00:35:41 crc kubenswrapper[4759]: I1205 00:35:41.562377 4759 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 00:35:41 crc kubenswrapper[4759]: I1205 00:35:41.746167 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" event={"ID":"96032405-2b01-4177-895c-f26ca2d838a9","Type":"ContainerStarted","Data":"ee8e71fa11459063cd1a65de2ac6129cec2eaeada7444eb6efe0fe5f9e67e379"} Dec 05 00:35:41 crc kubenswrapper[4759]: I1205 00:35:41.747075 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:41 crc kubenswrapper[4759]: I1205 00:35:41.766818 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/perses-operator-5446b9c989-qp8vs" podStartSLOduration=20.978028494 podStartE2EDuration="23.766800207s" podCreationTimestamp="2025-12-05 00:35:18 +0000 UTC" firstStartedPulling="2025-12-05 00:35:38.66006061 +0000 UTC m=+757.875721560" lastFinishedPulling="2025-12-05 00:35:41.448832323 +0000 UTC m=+760.664493273" observedRunningTime="2025-12-05 00:35:41.763825814 +0000 UTC m=+760.979486764" watchObservedRunningTime="2025-12-05 00:35:41.766800207 +0000 UTC m=+760.982461147" Dec 05 00:35:44 crc kubenswrapper[4759]: I1205 00:35:44.762746 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" event={"ID":"d87236a6-f3c6-470f-a197-05846a9b0c22","Type":"ContainerStarted","Data":"b82d9fba5fcf6f7f6bce9d506acabb7d399763f8febe1a8c397d16818486cda5"} Dec 05 00:35:44 crc kubenswrapper[4759]: I1205 00:35:44.763086 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" Dec 05 00:35:44 crc kubenswrapper[4759]: I1205 00:35:44.782465 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" podStartSLOduration=21.039755481 podStartE2EDuration="26.782440308s" podCreationTimestamp="2025-12-05 00:35:18 +0000 UTC" firstStartedPulling="2025-12-05 00:35:38.49859448 +0000 UTC m=+757.714255430" lastFinishedPulling="2025-12-05 00:35:44.241279307 +0000 UTC m=+763.456940257" observedRunningTime="2025-12-05 00:35:44.780166112 +0000 UTC m=+763.995827072" watchObservedRunningTime="2025-12-05 00:35:44.782440308 +0000 UTC m=+763.998101278" Dec 05 00:35:44 crc kubenswrapper[4759]: I1205 00:35:44.841885 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-6pmzz" Dec 05 00:35:48 crc kubenswrapper[4759]: I1205 00:35:48.614438 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-qp8vs" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.610734 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-p4smd"] Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.612115 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-p4smd" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.630625 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.631498 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.631705 4759 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-mgtq4" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.637173 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-p4smd"] Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.640447 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql2ff\" (UniqueName: \"kubernetes.io/projected/a52b68b3-6d54-414e-8ab9-37788f4ec793-kube-api-access-ql2ff\") pod \"cert-manager-cainjector-7f985d654d-p4smd\" (UID: \"a52b68b3-6d54-414e-8ab9-37788f4ec793\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-p4smd" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.672882 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4d8pk"] Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.673914 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4d8pk" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.680331 4759 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-l2l5m" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.686834 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-h8v7g"] Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.687546 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8v7g" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.690725 4759 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-nzvnj" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.700364 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-h8v7g"] Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.706101 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4d8pk"] Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.741871 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv8jj\" (UniqueName: \"kubernetes.io/projected/2e4a4c71-73fb-4b2d-aa99-aba26e4b4d65-kube-api-access-cv8jj\") pod \"cert-manager-webhook-5655c58dd6-h8v7g\" (UID: \"2e4a4c71-73fb-4b2d-aa99-aba26e4b4d65\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-h8v7g" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.741929 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql2ff\" (UniqueName: \"kubernetes.io/projected/a52b68b3-6d54-414e-8ab9-37788f4ec793-kube-api-access-ql2ff\") pod \"cert-manager-cainjector-7f985d654d-p4smd\" (UID: \"a52b68b3-6d54-414e-8ab9-37788f4ec793\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-p4smd" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.741949 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66pnl\" (UniqueName: \"kubernetes.io/projected/a45999cd-2b52-4802-8bf1-98905eb68923-kube-api-access-66pnl\") pod \"cert-manager-5b446d88c5-4d8pk\" (UID: \"a45999cd-2b52-4802-8bf1-98905eb68923\") " pod="cert-manager/cert-manager-5b446d88c5-4d8pk" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.766295 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql2ff\" (UniqueName: \"kubernetes.io/projected/a52b68b3-6d54-414e-8ab9-37788f4ec793-kube-api-access-ql2ff\") pod \"cert-manager-cainjector-7f985d654d-p4smd\" (UID: \"a52b68b3-6d54-414e-8ab9-37788f4ec793\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-p4smd" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.842750 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66pnl\" (UniqueName: \"kubernetes.io/projected/a45999cd-2b52-4802-8bf1-98905eb68923-kube-api-access-66pnl\") pod \"cert-manager-5b446d88c5-4d8pk\" (UID: \"a45999cd-2b52-4802-8bf1-98905eb68923\") " pod="cert-manager/cert-manager-5b446d88c5-4d8pk" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.842841 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv8jj\" (UniqueName: \"kubernetes.io/projected/2e4a4c71-73fb-4b2d-aa99-aba26e4b4d65-kube-api-access-cv8jj\") pod \"cert-manager-webhook-5655c58dd6-h8v7g\" (UID: \"2e4a4c71-73fb-4b2d-aa99-aba26e4b4d65\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-h8v7g" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.861328 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66pnl\" (UniqueName: \"kubernetes.io/projected/a45999cd-2b52-4802-8bf1-98905eb68923-kube-api-access-66pnl\") pod \"cert-manager-5b446d88c5-4d8pk\" (UID: \"a45999cd-2b52-4802-8bf1-98905eb68923\") " 
pod="cert-manager/cert-manager-5b446d88c5-4d8pk" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.867174 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv8jj\" (UniqueName: \"kubernetes.io/projected/2e4a4c71-73fb-4b2d-aa99-aba26e4b4d65-kube-api-access-cv8jj\") pod \"cert-manager-webhook-5655c58dd6-h8v7g\" (UID: \"2e4a4c71-73fb-4b2d-aa99-aba26e4b4d65\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-h8v7g" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.951678 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-p4smd" Dec 05 00:35:52 crc kubenswrapper[4759]: I1205 00:35:52.997124 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4d8pk" Dec 05 00:35:53 crc kubenswrapper[4759]: I1205 00:35:53.006943 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8v7g" Dec 05 00:35:53 crc kubenswrapper[4759]: I1205 00:35:53.550469 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4d8pk"] Dec 05 00:35:53 crc kubenswrapper[4759]: W1205 00:35:53.554873 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda45999cd_2b52_4802_8bf1_98905eb68923.slice/crio-130263cfb5431c14e39658936acc2c11bae1c38f1424876fdfa3614da511fc5f WatchSource:0}: Error finding container 130263cfb5431c14e39658936acc2c11bae1c38f1424876fdfa3614da511fc5f: Status 404 returned error can't find the container with id 130263cfb5431c14e39658936acc2c11bae1c38f1424876fdfa3614da511fc5f Dec 05 00:35:53 crc kubenswrapper[4759]: I1205 00:35:53.576328 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-p4smd"] Dec 05 00:35:53 crc kubenswrapper[4759]: W1205 00:35:53.581948 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda52b68b3_6d54_414e_8ab9_37788f4ec793.slice/crio-d1e0412a309fc9d0b5e540e54fb64f4d16160cfddb7fc98a54c7d6a0d8f8a35c WatchSource:0}: Error finding container d1e0412a309fc9d0b5e540e54fb64f4d16160cfddb7fc98a54c7d6a0d8f8a35c: Status 404 returned error can't find the container with id d1e0412a309fc9d0b5e540e54fb64f4d16160cfddb7fc98a54c7d6a0d8f8a35c Dec 05 00:35:53 crc kubenswrapper[4759]: I1205 00:35:53.684204 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-h8v7g"] Dec 05 00:35:53 crc kubenswrapper[4759]: W1205 00:35:53.688075 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e4a4c71_73fb_4b2d_aa99_aba26e4b4d65.slice/crio-919f26b365460281497092a480d77b7d84ca6381cf4e5b90a94e179561a246a2 WatchSource:0}: Error finding container 919f26b365460281497092a480d77b7d84ca6381cf4e5b90a94e179561a246a2: Status 404 returned error can't find the container with id 919f26b365460281497092a480d77b7d84ca6381cf4e5b90a94e179561a246a2 Dec 05 00:35:53 crc kubenswrapper[4759]: I1205 00:35:53.836386 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8v7g" event={"ID":"2e4a4c71-73fb-4b2d-aa99-aba26e4b4d65","Type":"ContainerStarted","Data":"919f26b365460281497092a480d77b7d84ca6381cf4e5b90a94e179561a246a2"} Dec 05 00:35:53 crc 
kubenswrapper[4759]: I1205 00:35:53.837699 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-p4smd" event={"ID":"a52b68b3-6d54-414e-8ab9-37788f4ec793","Type":"ContainerStarted","Data":"d1e0412a309fc9d0b5e540e54fb64f4d16160cfddb7fc98a54c7d6a0d8f8a35c"} Dec 05 00:35:53 crc kubenswrapper[4759]: I1205 00:35:53.838802 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4d8pk" event={"ID":"a45999cd-2b52-4802-8bf1-98905eb68923","Type":"ContainerStarted","Data":"130263cfb5431c14e39658936acc2c11bae1c38f1424876fdfa3614da511fc5f"} Dec 05 00:35:57 crc kubenswrapper[4759]: I1205 00:35:57.882569 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8v7g" event={"ID":"2e4a4c71-73fb-4b2d-aa99-aba26e4b4d65","Type":"ContainerStarted","Data":"c07bfc2df7fc34331ca57dbbb891d5305db12b982942a7996d110869bcd6bd6f"} Dec 05 00:35:57 crc kubenswrapper[4759]: I1205 00:35:57.882938 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8v7g" Dec 05 00:35:57 crc kubenswrapper[4759]: I1205 00:35:57.884619 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-p4smd" event={"ID":"a52b68b3-6d54-414e-8ab9-37788f4ec793","Type":"ContainerStarted","Data":"0526a103fb326cca2504c96ff61b4eafd6630195a6b7663b058b2df4ae24d311"} Dec 05 00:35:57 crc kubenswrapper[4759]: I1205 00:35:57.886077 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4d8pk" event={"ID":"a45999cd-2b52-4802-8bf1-98905eb68923","Type":"ContainerStarted","Data":"f19a134bb68d83c2a8a83195516a25e36ce2118ecb55382b536f875ee18a3d4b"} Dec 05 00:35:57 crc kubenswrapper[4759]: I1205 00:35:57.901602 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8v7g" podStartSLOduration=2.441500961 podStartE2EDuration="5.901575289s" podCreationTimestamp="2025-12-05 00:35:52 +0000 UTC" firstStartedPulling="2025-12-05 00:35:53.690256218 +0000 UTC m=+772.905917168" lastFinishedPulling="2025-12-05 00:35:57.150330546 +0000 UTC m=+776.365991496" observedRunningTime="2025-12-05 00:35:57.897734745 +0000 UTC m=+777.113395705" watchObservedRunningTime="2025-12-05 00:35:57.901575289 +0000 UTC m=+777.117236279" Dec 05 00:35:57 crc kubenswrapper[4759]: I1205 00:35:57.916600 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-p4smd" podStartSLOduration=2.353324218 podStartE2EDuration="5.916579091s" podCreationTimestamp="2025-12-05 00:35:52 +0000 UTC" firstStartedPulling="2025-12-05 00:35:53.585634357 +0000 UTC m=+772.801295307" lastFinishedPulling="2025-12-05 00:35:57.14888923 +0000 UTC m=+776.364550180" observedRunningTime="2025-12-05 00:35:57.914246423 +0000 UTC m=+777.129907383" watchObservedRunningTime="2025-12-05 00:35:57.916579091 +0000 UTC m=+777.132240051" Dec 05 00:35:57 crc kubenswrapper[4759]: I1205 00:35:57.964124 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-4d8pk" podStartSLOduration=2.376653186 podStartE2EDuration="5.964106888s" podCreationTimestamp="2025-12-05 00:35:52 +0000 UTC" firstStartedPulling="2025-12-05 00:35:53.566520413 +0000 UTC m=+772.782181353" lastFinishedPulling="2025-12-05 00:35:57.153974105 +0000 UTC m=+776.369635055" 
observedRunningTime="2025-12-05 00:35:57.96136871 +0000 UTC m=+777.177029670" watchObservedRunningTime="2025-12-05 00:35:57.964106888 +0000 UTC m=+777.179767838" Dec 05 00:36:01 crc kubenswrapper[4759]: I1205 00:36:01.318336 4759 scope.go:117] "RemoveContainer" containerID="4c277e547a8f7c1592cd0c3bebd793893aec29c78a484f47227d92770d1617e3" Dec 05 00:36:01 crc kubenswrapper[4759]: I1205 00:36:01.915851 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-llpn6_b33957c4-8ef0-4b57-8e3c-183091f3b022/kube-multus/2.log" Dec 05 00:36:03 crc kubenswrapper[4759]: I1205 00:36:03.009261 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8v7g" Dec 05 00:36:04 crc kubenswrapper[4759]: I1205 00:36:04.433397 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:36:04 crc kubenswrapper[4759]: I1205 00:36:04.433745 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:36:04 crc kubenswrapper[4759]: I1205 00:36:04.433790 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:36:04 crc kubenswrapper[4759]: I1205 00:36:04.434322 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18e2e3c4d5d7e9b0a92421362e0a25a15c418034c2ed08a024ef5b3fb196fc6f"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 00:36:04 crc kubenswrapper[4759]: I1205 00:36:04.434374 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://18e2e3c4d5d7e9b0a92421362e0a25a15c418034c2ed08a024ef5b3fb196fc6f" gracePeriod=600 Dec 05 00:36:04 crc kubenswrapper[4759]: I1205 00:36:04.936794 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="18e2e3c4d5d7e9b0a92421362e0a25a15c418034c2ed08a024ef5b3fb196fc6f" exitCode=0 Dec 05 00:36:04 crc kubenswrapper[4759]: I1205 00:36:04.936858 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"18e2e3c4d5d7e9b0a92421362e0a25a15c418034c2ed08a024ef5b3fb196fc6f"} Dec 05 00:36:04 crc kubenswrapper[4759]: I1205 00:36:04.937184 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"d9dc561c03abbf48d110afab06130c27d9acde9e46805bcdeb203f6dfe142c6b"} Dec 05 00:36:04 crc kubenswrapper[4759]: I1205 00:36:04.937210 4759 scope.go:117] 
"RemoveContainer" containerID="8291b2f562c8ceee6453c4a2ea4879386376a25a8f93dcefd90d8fd30f5e5702" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.217588 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2"] Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.219621 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.221731 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.236486 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2"] Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.336337 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.336494 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqvcm\" (UniqueName: \"kubernetes.io/projected/122b71c8-7720-4f98-b9b6-e68ca39986bf-kube-api-access-fqvcm\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.336593 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.437248 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqvcm\" (UniqueName: \"kubernetes.io/projected/122b71c8-7720-4f98-b9b6-e68ca39986bf-kube-api-access-fqvcm\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.437335 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.437378 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-util\") 
pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.437793 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.437920 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.458877 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqvcm\" (UniqueName: \"kubernetes.io/projected/122b71c8-7720-4f98-b9b6-e68ca39986bf-kube-api-access-fqvcm\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.537338 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:30 crc kubenswrapper[4759]: I1205 00:36:30.751000 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2"] Dec 05 00:36:31 crc kubenswrapper[4759]: I1205 00:36:31.110720 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" event={"ID":"122b71c8-7720-4f98-b9b6-e68ca39986bf","Type":"ContainerStarted","Data":"5c4cbc3c01da2ce85f86443719b0aa910d0171253e97c66e93eecd5bc6570d99"} Dec 05 00:36:32 crc kubenswrapper[4759]: I1205 00:36:32.117378 4759 generic.go:334] "Generic (PLEG): container finished" podID="122b71c8-7720-4f98-b9b6-e68ca39986bf" containerID="fd5bd03edd685e264559800717d7d41215c3e7bda6c8372ed90da4e01be6d753" exitCode=0 Dec 05 00:36:32 crc kubenswrapper[4759]: I1205 00:36:32.117419 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" event={"ID":"122b71c8-7720-4f98-b9b6-e68ca39986bf","Type":"ContainerDied","Data":"fd5bd03edd685e264559800717d7d41215c3e7bda6c8372ed90da4e01be6d753"} Dec 05 00:36:33 crc kubenswrapper[4759]: I1205 00:36:33.778647 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hbzz6"] Dec 05 00:36:33 crc kubenswrapper[4759]: I1205 00:36:33.780854 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:33 crc kubenswrapper[4759]: I1205 00:36:33.794134 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hbzz6"] Dec 05 00:36:33 crc kubenswrapper[4759]: I1205 00:36:33.891639 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-utilities\") pod \"redhat-operators-hbzz6\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:33 crc kubenswrapper[4759]: I1205 00:36:33.891695 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flttd\" (UniqueName: \"kubernetes.io/projected/7c42cc17-9e46-4598-84fe-702c1ade765b-kube-api-access-flttd\") pod \"redhat-operators-hbzz6\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:33 crc kubenswrapper[4759]: I1205 00:36:33.891733 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-catalog-content\") pod \"redhat-operators-hbzz6\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:33 crc kubenswrapper[4759]: I1205 00:36:33.992338 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-catalog-content\") pod \"redhat-operators-hbzz6\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:33 crc kubenswrapper[4759]: I1205 00:36:33.992415 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-utilities\") pod \"redhat-operators-hbzz6\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:33 crc kubenswrapper[4759]: I1205 00:36:33.992436 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flttd\" (UniqueName: \"kubernetes.io/projected/7c42cc17-9e46-4598-84fe-702c1ade765b-kube-api-access-flttd\") pod \"redhat-operators-hbzz6\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:33 crc kubenswrapper[4759]: I1205 00:36:33.992941 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-catalog-content\") pod \"redhat-operators-hbzz6\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:33 crc kubenswrapper[4759]: I1205 00:36:33.992996 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-utilities\") pod \"redhat-operators-hbzz6\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:34 crc kubenswrapper[4759]: I1205 00:36:34.014412 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-flttd\" (UniqueName: \"kubernetes.io/projected/7c42cc17-9e46-4598-84fe-702c1ade765b-kube-api-access-flttd\") pod \"redhat-operators-hbzz6\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:34 crc kubenswrapper[4759]: I1205 00:36:34.103724 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:34 crc kubenswrapper[4759]: I1205 00:36:34.351302 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hbzz6"] Dec 05 00:36:35 crc kubenswrapper[4759]: I1205 00:36:35.134758 4759 generic.go:334] "Generic (PLEG): container finished" podID="7c42cc17-9e46-4598-84fe-702c1ade765b" containerID="6e73cc93c7e65b092f30d2f4025d1d0206349d3c34a339025f2a1e8f178e94f0" exitCode=0 Dec 05 00:36:35 crc kubenswrapper[4759]: I1205 00:36:35.134806 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbzz6" event={"ID":"7c42cc17-9e46-4598-84fe-702c1ade765b","Type":"ContainerDied","Data":"6e73cc93c7e65b092f30d2f4025d1d0206349d3c34a339025f2a1e8f178e94f0"} Dec 05 00:36:35 crc kubenswrapper[4759]: I1205 00:36:35.135068 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbzz6" event={"ID":"7c42cc17-9e46-4598-84fe-702c1ade765b","Type":"ContainerStarted","Data":"8c7fc7ad7b84c6d1989918ed006e0b04e005239fed8cd9381b3b2f139dbbf494"} Dec 05 00:36:36 crc kubenswrapper[4759]: I1205 00:36:36.144471 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbzz6" event={"ID":"7c42cc17-9e46-4598-84fe-702c1ade765b","Type":"ContainerStarted","Data":"d1c27c59c154ebaf4ad111c1373e5c3058460fccf23da3ec2c360c41e5bcc554"} Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.152363 4759 generic.go:334] "Generic (PLEG): container finished" podID="7c42cc17-9e46-4598-84fe-702c1ade765b" containerID="d1c27c59c154ebaf4ad111c1373e5c3058460fccf23da3ec2c360c41e5bcc554" exitCode=0 Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.152421 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbzz6" event={"ID":"7c42cc17-9e46-4598-84fe-702c1ade765b","Type":"ContainerDied","Data":"d1c27c59c154ebaf4ad111c1373e5c3058460fccf23da3ec2c360c41e5bcc554"} Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.203595 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v"] Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.204917 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.219276 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v"] Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.335463 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.335501 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.335554 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfzl7\" (UniqueName: \"kubernetes.io/projected/5126d26d-8fe3-473d-bd52-52709d0fbb37-kube-api-access-vfzl7\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.437536 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.437602 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.437689 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfzl7\" (UniqueName: \"kubernetes.io/projected/5126d26d-8fe3-473d-bd52-52709d0fbb37-kube-api-access-vfzl7\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.437997 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " 
pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.438567 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.456199 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfzl7\" (UniqueName: \"kubernetes.io/projected/5126d26d-8fe3-473d-bd52-52709d0fbb37-kube-api-access-vfzl7\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.518994 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:37 crc kubenswrapper[4759]: I1205 00:36:37.727370 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v"] Dec 05 00:36:37 crc kubenswrapper[4759]: W1205 00:36:37.731915 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5126d26d_8fe3_473d_bd52_52709d0fbb37.slice/crio-28202a2456735b95fc62d3e4dac0de2470d5884a727c9ccb78a3383583da1fc8 WatchSource:0}: Error finding container 28202a2456735b95fc62d3e4dac0de2470d5884a727c9ccb78a3383583da1fc8: Status 404 returned error can't find the container with id 28202a2456735b95fc62d3e4dac0de2470d5884a727c9ccb78a3383583da1fc8 Dec 05 00:36:38 crc kubenswrapper[4759]: I1205 00:36:38.159719 4759 generic.go:334] "Generic (PLEG): container finished" podID="5126d26d-8fe3-473d-bd52-52709d0fbb37" containerID="506ef74212c26b66fba6403934c881a7c3ef1401a108fd7feb14c9b0094d8b4c" exitCode=0 Dec 05 00:36:38 crc kubenswrapper[4759]: I1205 00:36:38.159887 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" event={"ID":"5126d26d-8fe3-473d-bd52-52709d0fbb37","Type":"ContainerDied","Data":"506ef74212c26b66fba6403934c881a7c3ef1401a108fd7feb14c9b0094d8b4c"} Dec 05 00:36:38 crc kubenswrapper[4759]: I1205 00:36:38.162188 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" event={"ID":"5126d26d-8fe3-473d-bd52-52709d0fbb37","Type":"ContainerStarted","Data":"28202a2456735b95fc62d3e4dac0de2470d5884a727c9ccb78a3383583da1fc8"} Dec 05 00:36:38 crc kubenswrapper[4759]: I1205 00:36:38.165291 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbzz6" event={"ID":"7c42cc17-9e46-4598-84fe-702c1ade765b","Type":"ContainerStarted","Data":"94803a3bcef98c60e40733f389f4909833722d743c1b9b068074c28287a5b69f"} Dec 05 00:36:38 crc kubenswrapper[4759]: I1205 00:36:38.193722 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hbzz6" podStartSLOduration=2.391831414 podStartE2EDuration="5.193706038s" 
podCreationTimestamp="2025-12-05 00:36:33 +0000 UTC" firstStartedPulling="2025-12-05 00:36:35.136117337 +0000 UTC m=+814.351778287" lastFinishedPulling="2025-12-05 00:36:37.937991961 +0000 UTC m=+817.153652911" observedRunningTime="2025-12-05 00:36:38.18938232 +0000 UTC m=+817.405043270" watchObservedRunningTime="2025-12-05 00:36:38.193706038 +0000 UTC m=+817.409366988" Dec 05 00:36:40 crc kubenswrapper[4759]: I1205 00:36:40.178496 4759 generic.go:334] "Generic (PLEG): container finished" podID="5126d26d-8fe3-473d-bd52-52709d0fbb37" containerID="207e712d9f4421ec7425c7c73046563a4310b3c92bc7059582033c60ae94bbd7" exitCode=0 Dec 05 00:36:40 crc kubenswrapper[4759]: I1205 00:36:40.178765 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" event={"ID":"5126d26d-8fe3-473d-bd52-52709d0fbb37","Type":"ContainerDied","Data":"207e712d9f4421ec7425c7c73046563a4310b3c92bc7059582033c60ae94bbd7"} Dec 05 00:36:41 crc kubenswrapper[4759]: I1205 00:36:41.185872 4759 generic.go:334] "Generic (PLEG): container finished" podID="5126d26d-8fe3-473d-bd52-52709d0fbb37" containerID="f69fe5a1bcd2cb465839b26c5ce38680fc4d852f86f7c18142c5c30d14b8e7a0" exitCode=0 Dec 05 00:36:41 crc kubenswrapper[4759]: I1205 00:36:41.185911 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" event={"ID":"5126d26d-8fe3-473d-bd52-52709d0fbb37","Type":"ContainerDied","Data":"f69fe5a1bcd2cb465839b26c5ce38680fc4d852f86f7c18142c5c30d14b8e7a0"} Dec 05 00:36:42 crc kubenswrapper[4759]: I1205 00:36:42.448783 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:42 crc kubenswrapper[4759]: I1205 00:36:42.616467 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfzl7\" (UniqueName: \"kubernetes.io/projected/5126d26d-8fe3-473d-bd52-52709d0fbb37-kube-api-access-vfzl7\") pod \"5126d26d-8fe3-473d-bd52-52709d0fbb37\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " Dec 05 00:36:42 crc kubenswrapper[4759]: I1205 00:36:42.616679 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-util\") pod \"5126d26d-8fe3-473d-bd52-52709d0fbb37\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " Dec 05 00:36:42 crc kubenswrapper[4759]: I1205 00:36:42.616716 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-bundle\") pod \"5126d26d-8fe3-473d-bd52-52709d0fbb37\" (UID: \"5126d26d-8fe3-473d-bd52-52709d0fbb37\") " Dec 05 00:36:42 crc kubenswrapper[4759]: I1205 00:36:42.617610 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-bundle" (OuterVolumeSpecName: "bundle") pod "5126d26d-8fe3-473d-bd52-52709d0fbb37" (UID: "5126d26d-8fe3-473d-bd52-52709d0fbb37"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:36:42 crc kubenswrapper[4759]: I1205 00:36:42.629365 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-util" (OuterVolumeSpecName: "util") pod "5126d26d-8fe3-473d-bd52-52709d0fbb37" (UID: "5126d26d-8fe3-473d-bd52-52709d0fbb37"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:36:42 crc kubenswrapper[4759]: I1205 00:36:42.634593 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5126d26d-8fe3-473d-bd52-52709d0fbb37-kube-api-access-vfzl7" (OuterVolumeSpecName: "kube-api-access-vfzl7") pod "5126d26d-8fe3-473d-bd52-52709d0fbb37" (UID: "5126d26d-8fe3-473d-bd52-52709d0fbb37"). InnerVolumeSpecName "kube-api-access-vfzl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:36:42 crc kubenswrapper[4759]: I1205 00:36:42.717885 4759 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-util\") on node \"crc\" DevicePath \"\"" Dec 05 00:36:42 crc kubenswrapper[4759]: I1205 00:36:42.717935 4759 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5126d26d-8fe3-473d-bd52-52709d0fbb37-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:36:42 crc kubenswrapper[4759]: I1205 00:36:42.717954 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfzl7\" (UniqueName: \"kubernetes.io/projected/5126d26d-8fe3-473d-bd52-52709d0fbb37-kube-api-access-vfzl7\") on node \"crc\" DevicePath \"\"" Dec 05 00:36:43 crc kubenswrapper[4759]: I1205 00:36:43.201631 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" event={"ID":"5126d26d-8fe3-473d-bd52-52709d0fbb37","Type":"ContainerDied","Data":"28202a2456735b95fc62d3e4dac0de2470d5884a727c9ccb78a3383583da1fc8"} Dec 05 00:36:43 crc kubenswrapper[4759]: I1205 00:36:43.202042 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28202a2456735b95fc62d3e4dac0de2470d5884a727c9ccb78a3383583da1fc8" Dec 05 00:36:43 crc kubenswrapper[4759]: I1205 00:36:43.201706 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v" Dec 05 00:36:44 crc kubenswrapper[4759]: I1205 00:36:44.104049 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:44 crc kubenswrapper[4759]: I1205 00:36:44.104122 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:44 crc kubenswrapper[4759]: I1205 00:36:44.179521 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:44 crc kubenswrapper[4759]: I1205 00:36:44.251581 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:46 crc kubenswrapper[4759]: I1205 00:36:46.163811 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hbzz6"] Dec 05 00:36:46 crc kubenswrapper[4759]: I1205 00:36:46.218773 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hbzz6" podUID="7c42cc17-9e46-4598-84fe-702c1ade765b" containerName="registry-server" containerID="cri-o://94803a3bcef98c60e40733f389f4909833722d743c1b9b068074c28287a5b69f" gracePeriod=2 Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.232596 4759 generic.go:334] "Generic (PLEG): container finished" podID="7c42cc17-9e46-4598-84fe-702c1ade765b" containerID="94803a3bcef98c60e40733f389f4909833722d743c1b9b068074c28287a5b69f" exitCode=0 Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.232645 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbzz6" event={"ID":"7c42cc17-9e46-4598-84fe-702c1ade765b","Type":"ContainerDied","Data":"94803a3bcef98c60e40733f389f4909833722d743c1b9b068074c28287a5b69f"} Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.406106 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-j6x8h"] Dec 05 00:36:48 crc kubenswrapper[4759]: E1205 00:36:48.406355 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5126d26d-8fe3-473d-bd52-52709d0fbb37" containerName="util" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.406369 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5126d26d-8fe3-473d-bd52-52709d0fbb37" containerName="util" Dec 05 00:36:48 crc kubenswrapper[4759]: E1205 00:36:48.406381 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5126d26d-8fe3-473d-bd52-52709d0fbb37" containerName="pull" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.406388 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5126d26d-8fe3-473d-bd52-52709d0fbb37" containerName="pull" Dec 05 00:36:48 crc kubenswrapper[4759]: E1205 00:36:48.406404 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5126d26d-8fe3-473d-bd52-52709d0fbb37" containerName="extract" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.406410 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5126d26d-8fe3-473d-bd52-52709d0fbb37" containerName="extract" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.406503 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5126d26d-8fe3-473d-bd52-52709d0fbb37" containerName="extract" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.406891 4759 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-j6x8h" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.408876 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.412045 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.412083 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-fk44z" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.415792 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-j6x8h"] Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.515372 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsz2s\" (UniqueName: \"kubernetes.io/projected/bd82eb36-66d9-4938-a5ab-29c36b1f482e-kube-api-access-jsz2s\") pod \"cluster-logging-operator-ff9846bd-j6x8h\" (UID: \"bd82eb36-66d9-4938-a5ab-29c36b1f482e\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-j6x8h" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.616804 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsz2s\" (UniqueName: \"kubernetes.io/projected/bd82eb36-66d9-4938-a5ab-29c36b1f482e-kube-api-access-jsz2s\") pod \"cluster-logging-operator-ff9846bd-j6x8h\" (UID: \"bd82eb36-66d9-4938-a5ab-29c36b1f482e\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-j6x8h" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.637949 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsz2s\" (UniqueName: \"kubernetes.io/projected/bd82eb36-66d9-4938-a5ab-29c36b1f482e-kube-api-access-jsz2s\") pod \"cluster-logging-operator-ff9846bd-j6x8h\" (UID: \"bd82eb36-66d9-4938-a5ab-29c36b1f482e\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-j6x8h" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.722118 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-j6x8h" Dec 05 00:36:48 crc kubenswrapper[4759]: I1205 00:36:48.968029 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.021828 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-catalog-content\") pod \"7c42cc17-9e46-4598-84fe-702c1ade765b\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.021982 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flttd\" (UniqueName: \"kubernetes.io/projected/7c42cc17-9e46-4598-84fe-702c1ade765b-kube-api-access-flttd\") pod \"7c42cc17-9e46-4598-84fe-702c1ade765b\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.022011 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-utilities\") pod \"7c42cc17-9e46-4598-84fe-702c1ade765b\" (UID: \"7c42cc17-9e46-4598-84fe-702c1ade765b\") " Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.023447 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-utilities" (OuterVolumeSpecName: "utilities") pod "7c42cc17-9e46-4598-84fe-702c1ade765b" (UID: "7c42cc17-9e46-4598-84fe-702c1ade765b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.027690 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c42cc17-9e46-4598-84fe-702c1ade765b-kube-api-access-flttd" (OuterVolumeSpecName: "kube-api-access-flttd") pod "7c42cc17-9e46-4598-84fe-702c1ade765b" (UID: "7c42cc17-9e46-4598-84fe-702c1ade765b"). InnerVolumeSpecName "kube-api-access-flttd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.122939 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flttd\" (UniqueName: \"kubernetes.io/projected/7c42cc17-9e46-4598-84fe-702c1ade765b-kube-api-access-flttd\") on node \"crc\" DevicePath \"\"" Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.123164 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.137540 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c42cc17-9e46-4598-84fe-702c1ade765b" (UID: "7c42cc17-9e46-4598-84fe-702c1ade765b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.201536 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-j6x8h"] Dec 05 00:36:49 crc kubenswrapper[4759]: W1205 00:36:49.205623 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd82eb36_66d9_4938_a5ab_29c36b1f482e.slice/crio-c13ad912b1dfdfe367a05e495091f08ffdaebbd3e5173f02bc5591f7cd1f92b5 WatchSource:0}: Error finding container c13ad912b1dfdfe367a05e495091f08ffdaebbd3e5173f02bc5591f7cd1f92b5: Status 404 returned error can't find the container with id c13ad912b1dfdfe367a05e495091f08ffdaebbd3e5173f02bc5591f7cd1f92b5 Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.224560 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c42cc17-9e46-4598-84fe-702c1ade765b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.240576 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-j6x8h" event={"ID":"bd82eb36-66d9-4938-a5ab-29c36b1f482e","Type":"ContainerStarted","Data":"c13ad912b1dfdfe367a05e495091f08ffdaebbd3e5173f02bc5591f7cd1f92b5"} Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.242887 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbzz6" event={"ID":"7c42cc17-9e46-4598-84fe-702c1ade765b","Type":"ContainerDied","Data":"8c7fc7ad7b84c6d1989918ed006e0b04e005239fed8cd9381b3b2f139dbbf494"} Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.242937 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hbzz6" Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.242948 4759 scope.go:117] "RemoveContainer" containerID="94803a3bcef98c60e40733f389f4909833722d743c1b9b068074c28287a5b69f" Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.259425 4759 scope.go:117] "RemoveContainer" containerID="d1c27c59c154ebaf4ad111c1373e5c3058460fccf23da3ec2c360c41e5bcc554" Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.264482 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hbzz6"] Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.268617 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hbzz6"] Dec 05 00:36:49 crc kubenswrapper[4759]: I1205 00:36:49.273209 4759 scope.go:117] "RemoveContainer" containerID="6e73cc93c7e65b092f30d2f4025d1d0206349d3c34a339025f2a1e8f178e94f0" Dec 05 00:36:51 crc kubenswrapper[4759]: I1205 00:36:51.163111 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c42cc17-9e46-4598-84fe-702c1ade765b" path="/var/lib/kubelet/pods/7c42cc17-9e46-4598-84fe-702c1ade765b/volumes" Dec 05 00:36:54 crc kubenswrapper[4759]: I1205 00:36:54.279447 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-j6x8h" event={"ID":"bd82eb36-66d9-4938-a5ab-29c36b1f482e","Type":"ContainerStarted","Data":"c066df95b1a2316abd8a2a27f62502bc078d0001fc0227b8f850e665fe83cad5"} Dec 05 00:36:54 crc kubenswrapper[4759]: I1205 00:36:54.299482 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-ff9846bd-j6x8h" podStartSLOduration=1.548339178 podStartE2EDuration="6.299463129s" podCreationTimestamp="2025-12-05 00:36:48 +0000 UTC" firstStartedPulling="2025-12-05 00:36:49.208399202 +0000 UTC m=+828.424060152" lastFinishedPulling="2025-12-05 00:36:53.959523153 +0000 UTC m=+833.175184103" observedRunningTime="2025-12-05 00:36:54.295665376 +0000 UTC m=+833.511326326" watchObservedRunningTime="2025-12-05 00:36:54.299463129 +0000 UTC m=+833.515124089" Dec 05 00:36:56 crc kubenswrapper[4759]: I1205 00:36:56.293339 4759 generic.go:334] "Generic (PLEG): container finished" podID="122b71c8-7720-4f98-b9b6-e68ca39986bf" containerID="e2b407c4e82a65f2cbb01362956e2b311969b18e37e2e22b5cd43dce041c1f07" exitCode=0 Dec 05 00:36:56 crc kubenswrapper[4759]: I1205 00:36:56.293407 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" event={"ID":"122b71c8-7720-4f98-b9b6-e68ca39986bf","Type":"ContainerDied","Data":"e2b407c4e82a65f2cbb01362956e2b311969b18e37e2e22b5cd43dce041c1f07"} Dec 05 00:36:57 crc kubenswrapper[4759]: I1205 00:36:57.303859 4759 generic.go:334] "Generic (PLEG): container finished" podID="122b71c8-7720-4f98-b9b6-e68ca39986bf" containerID="b2595e730e5aae5cafb8632ebbb636575846b52534c5159e515b6bf90ddfffc6" exitCode=0 Dec 05 00:36:57 crc kubenswrapper[4759]: I1205 00:36:57.303920 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" event={"ID":"122b71c8-7720-4f98-b9b6-e68ca39986bf","Type":"ContainerDied","Data":"b2595e730e5aae5cafb8632ebbb636575846b52534c5159e515b6bf90ddfffc6"} Dec 05 00:36:58 crc kubenswrapper[4759]: I1205 00:36:58.573782 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:36:58 crc kubenswrapper[4759]: I1205 00:36:58.650492 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-util\") pod \"122b71c8-7720-4f98-b9b6-e68ca39986bf\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " Dec 05 00:36:58 crc kubenswrapper[4759]: I1205 00:36:58.650587 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-bundle\") pod \"122b71c8-7720-4f98-b9b6-e68ca39986bf\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " Dec 05 00:36:58 crc kubenswrapper[4759]: I1205 00:36:58.650620 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqvcm\" (UniqueName: \"kubernetes.io/projected/122b71c8-7720-4f98-b9b6-e68ca39986bf-kube-api-access-fqvcm\") pod \"122b71c8-7720-4f98-b9b6-e68ca39986bf\" (UID: \"122b71c8-7720-4f98-b9b6-e68ca39986bf\") " Dec 05 00:36:58 crc kubenswrapper[4759]: I1205 00:36:58.652876 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-bundle" (OuterVolumeSpecName: "bundle") pod "122b71c8-7720-4f98-b9b6-e68ca39986bf" (UID: "122b71c8-7720-4f98-b9b6-e68ca39986bf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:36:58 crc kubenswrapper[4759]: I1205 00:36:58.655802 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122b71c8-7720-4f98-b9b6-e68ca39986bf-kube-api-access-fqvcm" (OuterVolumeSpecName: "kube-api-access-fqvcm") pod "122b71c8-7720-4f98-b9b6-e68ca39986bf" (UID: "122b71c8-7720-4f98-b9b6-e68ca39986bf"). InnerVolumeSpecName "kube-api-access-fqvcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:36:58 crc kubenswrapper[4759]: I1205 00:36:58.675933 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-util" (OuterVolumeSpecName: "util") pod "122b71c8-7720-4f98-b9b6-e68ca39986bf" (UID: "122b71c8-7720-4f98-b9b6-e68ca39986bf"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:36:58 crc kubenswrapper[4759]: I1205 00:36:58.751401 4759 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-util\") on node \"crc\" DevicePath \"\"" Dec 05 00:36:58 crc kubenswrapper[4759]: I1205 00:36:58.751629 4759 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/122b71c8-7720-4f98-b9b6-e68ca39986bf-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:36:58 crc kubenswrapper[4759]: I1205 00:36:58.751705 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqvcm\" (UniqueName: \"kubernetes.io/projected/122b71c8-7720-4f98-b9b6-e68ca39986bf-kube-api-access-fqvcm\") on node \"crc\" DevicePath \"\"" Dec 05 00:36:59 crc kubenswrapper[4759]: I1205 00:36:59.326121 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" event={"ID":"122b71c8-7720-4f98-b9b6-e68ca39986bf","Type":"ContainerDied","Data":"5c4cbc3c01da2ce85f86443719b0aa910d0171253e97c66e93eecd5bc6570d99"} Dec 05 00:36:59 crc kubenswrapper[4759]: I1205 00:36:59.326164 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c4cbc3c01da2ce85f86443719b0aa910d0171253e97c66e93eecd5bc6570d99" Dec 05 00:36:59 crc kubenswrapper[4759]: I1205 00:36:59.326224 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.826021 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz"] Dec 05 00:37:05 crc kubenswrapper[4759]: E1205 00:37:05.826721 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c42cc17-9e46-4598-84fe-702c1ade765b" containerName="extract-utilities" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.826733 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c42cc17-9e46-4598-84fe-702c1ade765b" containerName="extract-utilities" Dec 05 00:37:05 crc kubenswrapper[4759]: E1205 00:37:05.826741 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c42cc17-9e46-4598-84fe-702c1ade765b" containerName="extract-content" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.826746 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c42cc17-9e46-4598-84fe-702c1ade765b" containerName="extract-content" Dec 05 00:37:05 crc kubenswrapper[4759]: E1205 00:37:05.826758 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122b71c8-7720-4f98-b9b6-e68ca39986bf" containerName="util" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.826764 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="122b71c8-7720-4f98-b9b6-e68ca39986bf" containerName="util" Dec 05 00:37:05 crc kubenswrapper[4759]: E1205 00:37:05.826772 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c42cc17-9e46-4598-84fe-702c1ade765b" containerName="registry-server" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.826778 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c42cc17-9e46-4598-84fe-702c1ade765b" containerName="registry-server" Dec 05 00:37:05 crc kubenswrapper[4759]: E1205 00:37:05.826790 4759 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="122b71c8-7720-4f98-b9b6-e68ca39986bf" containerName="pull" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.826796 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="122b71c8-7720-4f98-b9b6-e68ca39986bf" containerName="pull" Dec 05 00:37:05 crc kubenswrapper[4759]: E1205 00:37:05.826803 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122b71c8-7720-4f98-b9b6-e68ca39986bf" containerName="extract" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.826810 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="122b71c8-7720-4f98-b9b6-e68ca39986bf" containerName="extract" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.826903 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="122b71c8-7720-4f98-b9b6-e68ca39986bf" containerName="extract" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.826914 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c42cc17-9e46-4598-84fe-702c1ade765b" containerName="registry-server" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.827508 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.830359 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.830357 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-fv6vj" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.830521 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.830718 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.833264 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.834191 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.846202 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz"] Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.980207 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wncbh\" (UniqueName: \"kubernetes.io/projected/7c292474-9687-4ff3-a1c3-4dffe9594a36-kube-api-access-wncbh\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.980283 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7c292474-9687-4ff3-a1c3-4dffe9594a36-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.980343 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c292474-9687-4ff3-a1c3-4dffe9594a36-apiservice-cert\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.980417 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7c292474-9687-4ff3-a1c3-4dffe9594a36-manager-config\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:05 crc kubenswrapper[4759]: I1205 00:37:05.980454 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c292474-9687-4ff3-a1c3-4dffe9594a36-webhook-cert\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.081373 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wncbh\" (UniqueName: \"kubernetes.io/projected/7c292474-9687-4ff3-a1c3-4dffe9594a36-kube-api-access-wncbh\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.081451 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7c292474-9687-4ff3-a1c3-4dffe9594a36-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.081481 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c292474-9687-4ff3-a1c3-4dffe9594a36-apiservice-cert\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.081520 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7c292474-9687-4ff3-a1c3-4dffe9594a36-manager-config\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.081557 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c292474-9687-4ff3-a1c3-4dffe9594a36-webhook-cert\") 
pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.083071 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7c292474-9687-4ff3-a1c3-4dffe9594a36-manager-config\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.087003 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c292474-9687-4ff3-a1c3-4dffe9594a36-apiservice-cert\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.087097 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7c292474-9687-4ff3-a1c3-4dffe9594a36-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.087848 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c292474-9687-4ff3-a1c3-4dffe9594a36-webhook-cert\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.108601 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wncbh\" (UniqueName: \"kubernetes.io/projected/7c292474-9687-4ff3-a1c3-4dffe9594a36-kube-api-access-wncbh\") pod \"loki-operator-controller-manager-55cfc66bc8-kk6tz\" (UID: \"7c292474-9687-4ff3-a1c3-4dffe9594a36\") " pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.142365 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:06 crc kubenswrapper[4759]: I1205 00:37:06.598550 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz"] Dec 05 00:37:06 crc kubenswrapper[4759]: W1205 00:37:06.602951 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c292474_9687_4ff3_a1c3_4dffe9594a36.slice/crio-0b1fd2ff3cb9850bbbec8dbaeb78064aca0cf1d68eddc63914e7acc074c85bbb WatchSource:0}: Error finding container 0b1fd2ff3cb9850bbbec8dbaeb78064aca0cf1d68eddc63914e7acc074c85bbb: Status 404 returned error can't find the container with id 0b1fd2ff3cb9850bbbec8dbaeb78064aca0cf1d68eddc63914e7acc074c85bbb Dec 05 00:37:07 crc kubenswrapper[4759]: I1205 00:37:07.377176 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" event={"ID":"7c292474-9687-4ff3-a1c3-4dffe9594a36","Type":"ContainerStarted","Data":"0b1fd2ff3cb9850bbbec8dbaeb78064aca0cf1d68eddc63914e7acc074c85bbb"} Dec 05 00:37:09 crc kubenswrapper[4759]: I1205 00:37:09.423587 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" event={"ID":"7c292474-9687-4ff3-a1c3-4dffe9594a36","Type":"ContainerStarted","Data":"a5de5fa5bc9602e70efad1def0527b722d7422522117f62e272856314e031b5a"} Dec 05 00:37:15 crc kubenswrapper[4759]: I1205 00:37:15.460711 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" event={"ID":"7c292474-9687-4ff3-a1c3-4dffe9594a36","Type":"ContainerStarted","Data":"7c91300c7f9112302ee59a2ce435a1dde95d8bd545d848525de199ad04ebab7d"} Dec 05 00:37:15 crc kubenswrapper[4759]: I1205 00:37:15.462800 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:15 crc kubenswrapper[4759]: I1205 00:37:15.463029 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" Dec 05 00:37:15 crc kubenswrapper[4759]: I1205 00:37:15.485289 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-55cfc66bc8-kk6tz" podStartSLOduration=2.078493104 podStartE2EDuration="10.485273761s" podCreationTimestamp="2025-12-05 00:37:05 +0000 UTC" firstStartedPulling="2025-12-05 00:37:06.606712799 +0000 UTC m=+845.822373749" lastFinishedPulling="2025-12-05 00:37:15.013493446 +0000 UTC m=+854.229154406" observedRunningTime="2025-12-05 00:37:15.48483193 +0000 UTC m=+854.700492880" watchObservedRunningTime="2025-12-05 00:37:15.485273761 +0000 UTC m=+854.700934711" Dec 05 00:37:20 crc kubenswrapper[4759]: I1205 00:37:20.764606 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Dec 05 00:37:20 crc kubenswrapper[4759]: I1205 00:37:20.766343 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Dec 05 00:37:20 crc kubenswrapper[4759]: I1205 00:37:20.771607 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 05 00:37:20 crc kubenswrapper[4759]: I1205 00:37:20.773387 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 05 00:37:20 crc kubenswrapper[4759]: I1205 00:37:20.776472 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 05 00:37:20 crc kubenswrapper[4759]: I1205 00:37:20.886616 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbd7t\" (UniqueName: \"kubernetes.io/projected/54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4-kube-api-access-rbd7t\") pod \"minio\" (UID: \"54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4\") " pod="minio-dev/minio" Dec 05 00:37:20 crc kubenswrapper[4759]: I1205 00:37:20.886693 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-79824628-79f3-423a-a960-247a710a7f53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-79824628-79f3-423a-a960-247a710a7f53\") pod \"minio\" (UID: \"54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4\") " pod="minio-dev/minio" Dec 05 00:37:20 crc kubenswrapper[4759]: I1205 00:37:20.987431 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbd7t\" (UniqueName: \"kubernetes.io/projected/54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4-kube-api-access-rbd7t\") pod \"minio\" (UID: \"54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4\") " pod="minio-dev/minio" Dec 05 00:37:20 crc kubenswrapper[4759]: I1205 00:37:20.987515 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-79824628-79f3-423a-a960-247a710a7f53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-79824628-79f3-423a-a960-247a710a7f53\") pod \"minio\" (UID: \"54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4\") " pod="minio-dev/minio" Dec 05 00:37:20 crc kubenswrapper[4759]: I1205 00:37:20.990813 4759 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 00:37:20 crc kubenswrapper[4759]: I1205 00:37:20.990848 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-79824628-79f3-423a-a960-247a710a7f53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-79824628-79f3-423a-a960-247a710a7f53\") pod \"minio\" (UID: \"54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b6d265c326d9619d6f3aa17ac43d2d63e94fa048918c28672f8f5e05b808ed3f/globalmount\"" pod="minio-dev/minio" Dec 05 00:37:21 crc kubenswrapper[4759]: I1205 00:37:21.025001 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbd7t\" (UniqueName: \"kubernetes.io/projected/54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4-kube-api-access-rbd7t\") pod \"minio\" (UID: \"54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4\") " pod="minio-dev/minio" Dec 05 00:37:21 crc kubenswrapper[4759]: I1205 00:37:21.027528 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-79824628-79f3-423a-a960-247a710a7f53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-79824628-79f3-423a-a960-247a710a7f53\") pod \"minio\" (UID: \"54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4\") " pod="minio-dev/minio" Dec 05 00:37:21 crc kubenswrapper[4759]: I1205 00:37:21.090425 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 05 00:37:21 crc kubenswrapper[4759]: I1205 00:37:21.563843 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 05 00:37:22 crc kubenswrapper[4759]: I1205 00:37:22.501168 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4","Type":"ContainerStarted","Data":"15304bcfa5eb283cb4682bd4b4ef429c277ce779d345a7a81ce137fec692c9ad"} Dec 05 00:37:25 crc kubenswrapper[4759]: I1205 00:37:25.530984 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"54d692b6-b6ae-46ca-afb8-8d09d1f5f0c4","Type":"ContainerStarted","Data":"517b7d26889563f1d07b196280abc53146e9ffa9994c0789f2c8aad0eca8b0af"} Dec 05 00:37:25 crc kubenswrapper[4759]: I1205 00:37:25.551567 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.059623948 podStartE2EDuration="7.551547334s" podCreationTimestamp="2025-12-05 00:37:18 +0000 UTC" firstStartedPulling="2025-12-05 00:37:21.57762072 +0000 UTC m=+860.793281710" lastFinishedPulling="2025-12-05 00:37:25.069544136 +0000 UTC m=+864.285205096" observedRunningTime="2025-12-05 00:37:25.549671208 +0000 UTC m=+864.765332198" watchObservedRunningTime="2025-12-05 00:37:25.551547334 +0000 UTC m=+864.767208294" Dec 05 00:37:29 crc kubenswrapper[4759]: I1205 00:37:29.958578 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb"] Dec 05 00:37:29 crc kubenswrapper[4759]: I1205 00:37:29.960149 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:29 crc kubenswrapper[4759]: I1205 00:37:29.962730 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Dec 05 00:37:29 crc kubenswrapper[4759]: I1205 00:37:29.963124 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Dec 05 00:37:29 crc kubenswrapper[4759]: I1205 00:37:29.963464 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-dhqgk" Dec 05 00:37:29 crc kubenswrapper[4759]: I1205 00:37:29.963705 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Dec 05 00:37:29 crc kubenswrapper[4759]: I1205 00:37:29.964922 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Dec 05 00:37:29 crc kubenswrapper[4759]: I1205 00:37:29.974803 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb"] Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.001223 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/ec954d4c-6908-403f-8241-87a5191ddd17-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.001291 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/ec954d4c-6908-403f-8241-87a5191ddd17-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.001369 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec954d4c-6908-403f-8241-87a5191ddd17-config\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.001405 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec954d4c-6908-403f-8241-87a5191ddd17-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.001439 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvpml\" (UniqueName: \"kubernetes.io/projected/ec954d4c-6908-403f-8241-87a5191ddd17-kube-api-access-wvpml\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.102897 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/ec954d4c-6908-403f-8241-87a5191ddd17-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.102968 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/ec954d4c-6908-403f-8241-87a5191ddd17-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.103008 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec954d4c-6908-403f-8241-87a5191ddd17-config\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.103040 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec954d4c-6908-403f-8241-87a5191ddd17-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.103079 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvpml\" (UniqueName: \"kubernetes.io/projected/ec954d4c-6908-403f-8241-87a5191ddd17-kube-api-access-wvpml\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.103994 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec954d4c-6908-403f-8241-87a5191ddd17-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.104455 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-plh5s"] Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.104809 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec954d4c-6908-403f-8241-87a5191ddd17-config\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.105366 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.111834 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.112054 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.112640 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.132122 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-plh5s"] Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.137120 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/ec954d4c-6908-403f-8241-87a5191ddd17-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.138800 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/ec954d4c-6908-403f-8241-87a5191ddd17-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.145707 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvpml\" (UniqueName: \"kubernetes.io/projected/ec954d4c-6908-403f-8241-87a5191ddd17-kube-api-access-wvpml\") pod \"logging-loki-distributor-76cc67bf56-sfxqb\" (UID: \"ec954d4c-6908-403f-8241-87a5191ddd17\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.198004 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t"] Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.199053 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.203892 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.203934 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.203988 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzxgp\" (UniqueName: \"kubernetes.io/projected/a02b1847-e805-40e3-bbfb-0585e864e6d0-kube-api-access-dzxgp\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.204019 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.204065 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.204085 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02b1847-e805-40e3-bbfb-0585e864e6d0-config\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.208880 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.209112 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.268591 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t"] Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.277283 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.304830 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/f45b1eaf-54f2-400d-996e-95fbaff73750-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.304896 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.304926 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02b1847-e805-40e3-bbfb-0585e864e6d0-config\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.304956 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f45b1eaf-54f2-400d-996e-95fbaff73750-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.304995 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.305021 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.305056 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c9bz\" (UniqueName: \"kubernetes.io/projected/f45b1eaf-54f2-400d-996e-95fbaff73750-kube-api-access-8c9bz\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.305082 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/f45b1eaf-54f2-400d-996e-95fbaff73750-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: 
\"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.305118 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzxgp\" (UniqueName: \"kubernetes.io/projected/a02b1847-e805-40e3-bbfb-0585e864e6d0-kube-api-access-dzxgp\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.305164 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f45b1eaf-54f2-400d-996e-95fbaff73750-config\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.305192 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.306137 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.307380 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02b1847-e805-40e3-bbfb-0585e864e6d0-config\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.315726 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.316968 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.317977 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a02b1847-e805-40e3-bbfb-0585e864e6d0-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.341370 4759 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzxgp\" (UniqueName: \"kubernetes.io/projected/a02b1847-e805-40e3-bbfb-0585e864e6d0-kube-api-access-dzxgp\") pod \"logging-loki-querier-5895d59bb8-plh5s\" (UID: \"a02b1847-e805-40e3-bbfb-0585e864e6d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.405915 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/f45b1eaf-54f2-400d-996e-95fbaff73750-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.405967 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f45b1eaf-54f2-400d-996e-95fbaff73750-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.406000 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c9bz\" (UniqueName: \"kubernetes.io/projected/f45b1eaf-54f2-400d-996e-95fbaff73750-kube-api-access-8c9bz\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.406025 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/f45b1eaf-54f2-400d-996e-95fbaff73750-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.406068 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f45b1eaf-54f2-400d-996e-95fbaff73750-config\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.407102 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f45b1eaf-54f2-400d-996e-95fbaff73750-config\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.407291 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f45b1eaf-54f2-400d-996e-95fbaff73750-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.409501 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/f45b1eaf-54f2-400d-996e-95fbaff73750-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.409502 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/f45b1eaf-54f2-400d-996e-95fbaff73750-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.425102 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c9bz\" (UniqueName: \"kubernetes.io/projected/f45b1eaf-54f2-400d-996e-95fbaff73750-kube-api-access-8c9bz\") pod \"logging-loki-query-frontend-84558f7c9f-62g2t\" (UID: \"f45b1eaf-54f2-400d-996e-95fbaff73750\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.442266 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-99665cbbf-gczvw"] Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.443097 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.446529 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.447039 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-spv5j" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.447275 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.448010 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.448152 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.451210 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.481873 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.502277 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-99665cbbf-cf7qk"] Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.510015 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/26e7c80b-666f-472c-8fb4-d3349c69227e-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.510082 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-logging-loki-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.510106 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-lokistack-gateway\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.510184 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/26e7c80b-666f-472c-8fb4-d3349c69227e-tls-secret\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.510239 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.510257 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pzmt\" (UniqueName: \"kubernetes.io/projected/26e7c80b-666f-472c-8fb4-d3349c69227e-kube-api-access-4pzmt\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.510328 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/26e7c80b-666f-472c-8fb4-d3349c69227e-tenants\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.510382 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: 
\"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-rbac\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.512244 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.520996 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.521300 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-99665cbbf-cf7qk"] Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.546871 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-99665cbbf-gczvw"] Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.620901 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv2qv\" (UniqueName: \"kubernetes.io/projected/e2fb0fbf-7c9c-4671-af48-6217b781c53d-kube-api-access-zv2qv\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.620955 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-rbac\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.620983 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-rbac\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.621018 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e2fb0fbf-7c9c-4671-af48-6217b781c53d-tls-secret\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.621483 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e2fb0fbf-7c9c-4671-af48-6217b781c53d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.621551 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/26e7c80b-666f-472c-8fb4-d3349c69227e-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " 
pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.621861 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-rbac\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.622204 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e2fb0fbf-7c9c-4671-af48-6217b781c53d-tenants\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.622248 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-lokistack-gateway\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.622634 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-logging-loki-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.622663 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-lokistack-gateway\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.622705 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/26e7c80b-666f-472c-8fb4-d3349c69227e-tls-secret\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.622763 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.622790 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.622814 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.622831 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pzmt\" (UniqueName: \"kubernetes.io/projected/26e7c80b-666f-472c-8fb4-d3349c69227e-kube-api-access-4pzmt\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.622873 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/26e7c80b-666f-472c-8fb4-d3349c69227e-tenants\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.624712 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-logging-loki-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.625132 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-lokistack-gateway\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.626456 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26e7c80b-666f-472c-8fb4-d3349c69227e-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.626895 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/26e7c80b-666f-472c-8fb4-d3349c69227e-tenants\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.628134 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/26e7c80b-666f-472c-8fb4-d3349c69227e-tls-secret\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.629180 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/26e7c80b-666f-472c-8fb4-d3349c69227e-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-99665cbbf-gczvw\" 
(UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.645956 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pzmt\" (UniqueName: \"kubernetes.io/projected/26e7c80b-666f-472c-8fb4-d3349c69227e-kube-api-access-4pzmt\") pod \"logging-loki-gateway-99665cbbf-gczvw\" (UID: \"26e7c80b-666f-472c-8fb4-d3349c69227e\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.650183 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb"] Dec 05 00:37:30 crc kubenswrapper[4759]: W1205 00:37:30.657142 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec954d4c_6908_403f_8241_87a5191ddd17.slice/crio-51c250bc4d41786e7e46a532f580df9f023b16cfba5b4c0d7be1a08e43678b12 WatchSource:0}: Error finding container 51c250bc4d41786e7e46a532f580df9f023b16cfba5b4c0d7be1a08e43678b12: Status 404 returned error can't find the container with id 51c250bc4d41786e7e46a532f580df9f023b16cfba5b4c0d7be1a08e43678b12 Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.726815 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e2fb0fbf-7c9c-4671-af48-6217b781c53d-tls-secret\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.726863 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e2fb0fbf-7c9c-4671-af48-6217b781c53d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.726887 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e2fb0fbf-7c9c-4671-af48-6217b781c53d-tenants\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.726906 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-lokistack-gateway\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.726951 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.726966 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.726996 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv2qv\" (UniqueName: \"kubernetes.io/projected/e2fb0fbf-7c9c-4671-af48-6217b781c53d-kube-api-access-zv2qv\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.727021 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-rbac\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.727882 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-rbac\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.729165 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.730512 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-lokistack-gateway\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.731099 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2fb0fbf-7c9c-4671-af48-6217b781c53d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.731353 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e2fb0fbf-7c9c-4671-af48-6217b781c53d-tls-secret\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.732780 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e2fb0fbf-7c9c-4671-af48-6217b781c53d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " 
pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.734472 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e2fb0fbf-7c9c-4671-af48-6217b781c53d-tenants\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.760938 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.769614 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv2qv\" (UniqueName: \"kubernetes.io/projected/e2fb0fbf-7c9c-4671-af48-6217b781c53d-kube-api-access-zv2qv\") pod \"logging-loki-gateway-99665cbbf-cf7qk\" (UID: \"e2fb0fbf-7c9c-4671-af48-6217b781c53d\") " pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.852059 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:30 crc kubenswrapper[4759]: I1205 00:37:30.935664 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-plh5s"] Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.001670 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t"] Dec 05 00:37:31 crc kubenswrapper[4759]: W1205 00:37:31.006088 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf45b1eaf_54f2_400d_996e_95fbaff73750.slice/crio-c86224ead99153f45431fe32274c9b69ed48392950c7840989803dad641c1564 WatchSource:0}: Error finding container c86224ead99153f45431fe32274c9b69ed48392950c7840989803dad641c1564: Status 404 returned error can't find the container with id c86224ead99153f45431fe32274c9b69ed48392950c7840989803dad641c1564 Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.081976 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-99665cbbf-cf7qk"] Dec 05 00:37:31 crc kubenswrapper[4759]: W1205 00:37:31.087834 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2fb0fbf_7c9c_4671_af48_6217b781c53d.slice/crio-6a1c9e4071d5fd19e07557c42ec2c8847bfc8480c2eb1d58f29d003354641ae5 WatchSource:0}: Error finding container 6a1c9e4071d5fd19e07557c42ec2c8847bfc8480c2eb1d58f29d003354641ae5: Status 404 returned error can't find the container with id 6a1c9e4071d5fd19e07557c42ec2c8847bfc8480c2eb1d58f29d003354641ae5 Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.146989 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.149704 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.152024 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.152038 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.164836 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.194739 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.197253 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.200055 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.200414 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.222431 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.228794 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-99665cbbf-gczvw"] Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.234975 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.235053 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.235091 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.235134 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4684c5d-5bd7-4500-8f88-1778f47325c3-config\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.235153 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: 
\"kubernetes.io/secret/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.235184 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxlwt\" (UniqueName: \"kubernetes.io/projected/d4684c5d-5bd7-4500-8f88-1778f47325c3-kube-api-access-lxlwt\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.235209 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8b7e9f8d-f02a-4950-a776-cf96ad18dd23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b7e9f8d-f02a-4950-a776-cf96ad18dd23\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.336669 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.336735 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47xk8\" (UniqueName: \"kubernetes.io/projected/e0a9677f-60fe-4bcf-8262-250684b96537-kube-api-access-47xk8\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.336767 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.336805 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.336832 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.336862 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" 
Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.336923 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4684c5d-5bd7-4500-8f88-1778f47325c3-config\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.336937 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.336955 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-010203a6-a752-46a8-8525-d0e79015f879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-010203a6-a752-46a8-8525-d0e79015f879\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.336994 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.337016 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a9677f-60fe-4bcf-8262-250684b96537-config\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.337049 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.337078 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxlwt\" (UniqueName: \"kubernetes.io/projected/d4684c5d-5bd7-4500-8f88-1778f47325c3-kube-api-access-lxlwt\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.337100 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8b7e9f8d-f02a-4950-a776-cf96ad18dd23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b7e9f8d-f02a-4950-a776-cf96ad18dd23\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.337117 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8f20ca98-94d2-471b-81a8-9c4a52e6854c\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f20ca98-94d2-471b-81a8-9c4a52e6854c\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.337813 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.338537 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4684c5d-5bd7-4500-8f88-1778f47325c3-config\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.340845 4759 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.340914 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8b7e9f8d-f02a-4950-a776-cf96ad18dd23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b7e9f8d-f02a-4950-a776-cf96ad18dd23\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1277bb91a6c014b0da731ce6d342e2e91910e474973431b0281c6b7c865b43e5/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.343859 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.347872 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.348964 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/d4684c5d-5bd7-4500-8f88-1778f47325c3-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.354028 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxlwt\" (UniqueName: \"kubernetes.io/projected/d4684c5d-5bd7-4500-8f88-1778f47325c3-kube-api-access-lxlwt\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.363033 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8b7e9f8d-f02a-4950-a776-cf96ad18dd23\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8b7e9f8d-f02a-4950-a776-cf96ad18dd23\") pod \"logging-loki-compactor-0\" (UID: \"d4684c5d-5bd7-4500-8f88-1778f47325c3\") " pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.438167 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.438325 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.438364 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-010203a6-a752-46a8-8525-d0e79015f879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-010203a6-a752-46a8-8525-d0e79015f879\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.438389 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.438443 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a9677f-60fe-4bcf-8262-250684b96537-config\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.438460 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.438495 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8f20ca98-94d2-471b-81a8-9c4a52e6854c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f20ca98-94d2-471b-81a8-9c4a52e6854c\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.438564 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47xk8\" (UniqueName: \"kubernetes.io/projected/e0a9677f-60fe-4bcf-8262-250684b96537-kube-api-access-47xk8\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.440178 4759 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a9677f-60fe-4bcf-8262-250684b96537-config\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.440393 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.443136 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.443927 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.445121 4759 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.445158 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8f20ca98-94d2-471b-81a8-9c4a52e6854c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f20ca98-94d2-471b-81a8-9c4a52e6854c\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/906c49d962c145227a0c9afdb046df0336390d22bb9f0f32d01c7eb2b412a588/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.445259 4759 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.445294 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-010203a6-a752-46a8-8525-d0e79015f879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-010203a6-a752-46a8-8525-d0e79015f879\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ce37e9a940be95dbd1fcae4c17c3d19304f5f7d17e98bc2ab81c5e02f17dfb43/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.450676 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/e0a9677f-60fe-4bcf-8262-250684b96537-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.458657 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47xk8\" (UniqueName: \"kubernetes.io/projected/e0a9677f-60fe-4bcf-8262-250684b96537-kube-api-access-47xk8\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.468745 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-010203a6-a752-46a8-8525-d0e79015f879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-010203a6-a752-46a8-8525-d0e79015f879\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.470379 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8f20ca98-94d2-471b-81a8-9c4a52e6854c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f20ca98-94d2-471b-81a8-9c4a52e6854c\") pod \"logging-loki-ingester-0\" (UID: \"e0a9677f-60fe-4bcf-8262-250684b96537\") " pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.485808 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.515870 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.581561 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" event={"ID":"ec954d4c-6908-403f-8241-87a5191ddd17","Type":"ContainerStarted","Data":"51c250bc4d41786e7e46a532f580df9f023b16cfba5b4c0d7be1a08e43678b12"} Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.582331 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" event={"ID":"e2fb0fbf-7c9c-4671-af48-6217b781c53d","Type":"ContainerStarted","Data":"6a1c9e4071d5fd19e07557c42ec2c8847bfc8480c2eb1d58f29d003354641ae5"} Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.583393 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" event={"ID":"a02b1847-e805-40e3-bbfb-0585e864e6d0","Type":"ContainerStarted","Data":"bef55f67bdafca468ea12a6cedfaff8ee61504e0f3d5fffa34f63a2ba92efbb5"} Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.584132 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" event={"ID":"26e7c80b-666f-472c-8fb4-d3349c69227e","Type":"ContainerStarted","Data":"bd0543e9bf6e4fdc6e937d2fd0ef680fa9eb0b67181ec97d2461765b9d2b917c"} Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.584809 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" event={"ID":"f45b1eaf-54f2-400d-996e-95fbaff73750","Type":"ContainerStarted","Data":"c86224ead99153f45431fe32274c9b69ed48392950c7840989803dad641c1564"} Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.722838 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 05 00:37:31 crc kubenswrapper[4759]: W1205 00:37:31.732509 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a9677f_60fe_4bcf_8262_250684b96537.slice/crio-473ccb70e9371ce35ec06373b7d9aaa4b85ee626d398f968c421a85a3ba61f81 WatchSource:0}: Error finding container 473ccb70e9371ce35ec06373b7d9aaa4b85ee626d398f968c421a85a3ba61f81: Status 404 returned error can't find the container with id 473ccb70e9371ce35ec06373b7d9aaa4b85ee626d398f968c421a85a3ba61f81 Dec 05 00:37:31 crc kubenswrapper[4759]: I1205 00:37:31.984420 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 05 00:37:31 crc kubenswrapper[4759]: W1205 00:37:31.992249 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4684c5d_5bd7_4500_8f88_1778f47325c3.slice/crio-1fe5538f1406be51dd0b579260dbc51d94db179af1b5d661d406fdb6859fb6b9 WatchSource:0}: Error finding container 1fe5538f1406be51dd0b579260dbc51d94db179af1b5d661d406fdb6859fb6b9: Status 404 returned error can't find the container with id 1fe5538f1406be51dd0b579260dbc51d94db179af1b5d661d406fdb6859fb6b9 Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.595778 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"e0a9677f-60fe-4bcf-8262-250684b96537","Type":"ContainerStarted","Data":"473ccb70e9371ce35ec06373b7d9aaa4b85ee626d398f968c421a85a3ba61f81"} Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.597934 4759 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"d4684c5d-5bd7-4500-8f88-1778f47325c3","Type":"ContainerStarted","Data":"1fe5538f1406be51dd0b579260dbc51d94db179af1b5d661d406fdb6859fb6b9"} Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.776817 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.777820 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.780809 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.781010 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.791557 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.967024 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.967085 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.967127 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2jd\" (UniqueName: \"kubernetes.io/projected/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-kube-api-access-tl2jd\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.967157 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.967195 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.967218 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: 
\"kubernetes.io/secret/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:32 crc kubenswrapper[4759]: I1205 00:37:32.967243 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-95c6842f-f47d-4a32-af48-a96bbfe14ca0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95c6842f-f47d-4a32-af48-a96bbfe14ca0\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.068658 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.068714 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.068746 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-95c6842f-f47d-4a32-af48-a96bbfe14ca0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95c6842f-f47d-4a32-af48-a96bbfe14ca0\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.068774 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.068806 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.068832 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2jd\" (UniqueName: \"kubernetes.io/projected/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-kube-api-access-tl2jd\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.068861 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " 
pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.069480 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.070358 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.072000 4759 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.072037 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-95c6842f-f47d-4a32-af48-a96bbfe14ca0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95c6842f-f47d-4a32-af48-a96bbfe14ca0\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a6ceb0ec30db51130bd1513f6d6b102e0db69381679b51fc0cd9b7ccf598910d/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.075558 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.076751 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.091416 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.093430 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2jd\" (UniqueName: \"kubernetes.io/projected/c6186de1-0fbc-4432-8bb4-c95e25efe3a7-kube-api-access-tl2jd\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.093823 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-95c6842f-f47d-4a32-af48-a96bbfe14ca0\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95c6842f-f47d-4a32-af48-a96bbfe14ca0\") pod \"logging-loki-index-gateway-0\" (UID: \"c6186de1-0fbc-4432-8bb4-c95e25efe3a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.119388 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:33 crc kubenswrapper[4759]: I1205 00:37:33.601455 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 05 00:37:34 crc kubenswrapper[4759]: I1205 00:37:34.612395 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"c6186de1-0fbc-4432-8bb4-c95e25efe3a7","Type":"ContainerStarted","Data":"fb1b98ea17b8d7b23c460f4f69647ae940c9c1106fa2e98eb49914a3f09b7d97"} Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.632771 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"e0a9677f-60fe-4bcf-8262-250684b96537","Type":"ContainerStarted","Data":"6668ebe38bd7b0cd01169b6380c86fd34bb31a3aeaad2a9e0d56ea72da01d6d9"} Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.633263 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.634930 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" event={"ID":"e2fb0fbf-7c9c-4671-af48-6217b781c53d","Type":"ContainerStarted","Data":"bb26e1d58e42532d806163a746265818f2a1f33e40418084adf05308ac02c63f"} Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.637240 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" event={"ID":"a02b1847-e805-40e3-bbfb-0585e864e6d0","Type":"ContainerStarted","Data":"23362bcdc29d04cc677ef3cf8d0b17bf83ede0a240670db0652f50ffad7e400e"} Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.637369 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.638883 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" event={"ID":"26e7c80b-666f-472c-8fb4-d3349c69227e","Type":"ContainerStarted","Data":"7e720cf4e7f2ec2838d0b80d213fcee16d07baa227a3c393988826b0a99c0049"} Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.641599 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" event={"ID":"f45b1eaf-54f2-400d-996e-95fbaff73750","Type":"ContainerStarted","Data":"2a92bc9bb2216c8fb8a5241aac45fb8e168b2168c10dd1a1b70df5e0bd7fb6c2"} Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.641734 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.643076 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"d4684c5d-5bd7-4500-8f88-1778f47325c3","Type":"ContainerStarted","Data":"756bbc87730a45840d364bae418b3cd6e46ece74e6c6715ae1b1f9acd75bb5e4"} Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.643206 4759 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.645190 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" event={"ID":"ec954d4c-6908-403f-8241-87a5191ddd17","Type":"ContainerStarted","Data":"f8f099a7d873d1c30dc4970861855fb0aa781f0fff5ef91ca8144627fc8deb99"} Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.645382 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.646787 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"c6186de1-0fbc-4432-8bb4-c95e25efe3a7","Type":"ContainerStarted","Data":"9851570dfdd0d9efea28d201bebc1eecb560cb8127c527b5f35cebcbf9621042"} Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.646946 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.676677 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.044768222 podStartE2EDuration="6.676660921s" podCreationTimestamp="2025-12-05 00:37:30 +0000 UTC" firstStartedPulling="2025-12-05 00:37:31.735470151 +0000 UTC m=+870.951131101" lastFinishedPulling="2025-12-05 00:37:35.36736281 +0000 UTC m=+874.583023800" observedRunningTime="2025-12-05 00:37:36.676137657 +0000 UTC m=+875.891798607" watchObservedRunningTime="2025-12-05 00:37:36.676660921 +0000 UTC m=+875.892321871" Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.699132 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" podStartSLOduration=2.341894707 podStartE2EDuration="6.699108926s" podCreationTimestamp="2025-12-05 00:37:30 +0000 UTC" firstStartedPulling="2025-12-05 00:37:31.010017738 +0000 UTC m=+870.225678698" lastFinishedPulling="2025-12-05 00:37:35.367231957 +0000 UTC m=+874.582892917" observedRunningTime="2025-12-05 00:37:36.697668441 +0000 UTC m=+875.913329391" watchObservedRunningTime="2025-12-05 00:37:36.699108926 +0000 UTC m=+875.914769876" Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.724938 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" podStartSLOduration=2.9995893049999998 podStartE2EDuration="7.724912214s" podCreationTimestamp="2025-12-05 00:37:29 +0000 UTC" firstStartedPulling="2025-12-05 00:37:30.660755364 +0000 UTC m=+869.876416314" lastFinishedPulling="2025-12-05 00:37:35.386078243 +0000 UTC m=+874.601739223" observedRunningTime="2025-12-05 00:37:36.723746675 +0000 UTC m=+875.939407645" watchObservedRunningTime="2025-12-05 00:37:36.724912214 +0000 UTC m=+875.940573174" Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.748053 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=5.101991502 podStartE2EDuration="6.748027107s" podCreationTimestamp="2025-12-05 00:37:30 +0000 UTC" firstStartedPulling="2025-12-05 00:37:33.626933589 +0000 UTC m=+872.842594539" lastFinishedPulling="2025-12-05 00:37:35.272969204 +0000 UTC m=+874.488630144" 
observedRunningTime="2025-12-05 00:37:36.743661759 +0000 UTC m=+875.959322729" watchObservedRunningTime="2025-12-05 00:37:36.748027107 +0000 UTC m=+875.963688057" Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.766645 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" podStartSLOduration=2.319557224 podStartE2EDuration="6.766620477s" podCreationTimestamp="2025-12-05 00:37:30 +0000 UTC" firstStartedPulling="2025-12-05 00:37:30.955231652 +0000 UTC m=+870.170892602" lastFinishedPulling="2025-12-05 00:37:35.402294905 +0000 UTC m=+874.617955855" observedRunningTime="2025-12-05 00:37:36.76151137 +0000 UTC m=+875.977172340" watchObservedRunningTime="2025-12-05 00:37:36.766620477 +0000 UTC m=+875.982281417" Dec 05 00:37:36 crc kubenswrapper[4759]: I1205 00:37:36.780209 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.4321654280000002 podStartE2EDuration="6.780191583s" podCreationTimestamp="2025-12-05 00:37:30 +0000 UTC" firstStartedPulling="2025-12-05 00:37:31.995147166 +0000 UTC m=+871.210808116" lastFinishedPulling="2025-12-05 00:37:35.343173321 +0000 UTC m=+874.558834271" observedRunningTime="2025-12-05 00:37:36.778841439 +0000 UTC m=+875.994502389" watchObservedRunningTime="2025-12-05 00:37:36.780191583 +0000 UTC m=+875.995852533" Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.667168 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" event={"ID":"26e7c80b-666f-472c-8fb4-d3349c69227e","Type":"ContainerStarted","Data":"981034dbbfa2e889a8f9c0bd7974df00b9347311cd3dede731e4395abb0c3c62"} Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.667714 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.667809 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.673157 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" event={"ID":"e2fb0fbf-7c9c-4671-af48-6217b781c53d","Type":"ContainerStarted","Data":"744199d88a259d87ba175b9da9cdc6b919043bb1b7484182c9fb1c7bb9b55dd8"} Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.673564 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.673627 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.681829 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.693875 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.694148 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.697906 4759 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.708090 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-99665cbbf-gczvw" podStartSLOduration=2.1559963890000002 podStartE2EDuration="8.708057703s" podCreationTimestamp="2025-12-05 00:37:30 +0000 UTC" firstStartedPulling="2025-12-05 00:37:31.242787418 +0000 UTC m=+870.458448358" lastFinishedPulling="2025-12-05 00:37:37.794848712 +0000 UTC m=+877.010509672" observedRunningTime="2025-12-05 00:37:38.701570912 +0000 UTC m=+877.917231902" watchObservedRunningTime="2025-12-05 00:37:38.708057703 +0000 UTC m=+877.923718713" Dec 05 00:37:38 crc kubenswrapper[4759]: I1205 00:37:38.825547 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-99665cbbf-cf7qk" podStartSLOduration=2.107116757 podStartE2EDuration="8.825289613s" podCreationTimestamp="2025-12-05 00:37:30 +0000 UTC" firstStartedPulling="2025-12-05 00:37:31.091416532 +0000 UTC m=+870.307077482" lastFinishedPulling="2025-12-05 00:37:37.809589378 +0000 UTC m=+877.025250338" observedRunningTime="2025-12-05 00:37:38.819420418 +0000 UTC m=+878.035081378" watchObservedRunningTime="2025-12-05 00:37:38.825289613 +0000 UTC m=+878.040950583" Dec 05 00:37:50 crc kubenswrapper[4759]: I1205 00:37:50.288169 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-sfxqb" Dec 05 00:37:50 crc kubenswrapper[4759]: I1205 00:37:50.491582 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-plh5s" Dec 05 00:37:50 crc kubenswrapper[4759]: I1205 00:37:50.539689 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-62g2t" Dec 05 00:37:51 crc kubenswrapper[4759]: I1205 00:37:51.492051 4759 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 05 00:37:51 crc kubenswrapper[4759]: I1205 00:37:51.492114 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="e0a9677f-60fe-4bcf-8262-250684b96537" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 00:37:51 crc kubenswrapper[4759]: I1205 00:37:51.521220 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Dec 05 00:37:53 crc kubenswrapper[4759]: I1205 00:37:53.127070 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Dec 05 00:38:01 crc kubenswrapper[4759]: I1205 00:38:01.494829 4759 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 05 00:38:01 crc kubenswrapper[4759]: I1205 00:38:01.495572 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="e0a9677f-60fe-4bcf-8262-250684b96537" 
containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 00:38:04 crc kubenswrapper[4759]: I1205 00:38:04.434108 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:38:04 crc kubenswrapper[4759]: I1205 00:38:04.434494 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:38:11 crc kubenswrapper[4759]: I1205 00:38:11.491596 4759 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 05 00:38:11 crc kubenswrapper[4759]: I1205 00:38:11.492449 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="e0a9677f-60fe-4bcf-8262-250684b96537" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.605404 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bgnqw"] Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.607898 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.634355 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bgnqw"] Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.753100 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-utilities\") pod \"community-operators-bgnqw\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.753809 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-catalog-content\") pod \"community-operators-bgnqw\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.753875 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zchpd\" (UniqueName: \"kubernetes.io/projected/d877f548-40fd-440c-9d46-db2b321adc66-kube-api-access-zchpd\") pod \"community-operators-bgnqw\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.854925 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-catalog-content\") pod 
\"community-operators-bgnqw\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.854981 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zchpd\" (UniqueName: \"kubernetes.io/projected/d877f548-40fd-440c-9d46-db2b321adc66-kube-api-access-zchpd\") pod \"community-operators-bgnqw\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.855040 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-utilities\") pod \"community-operators-bgnqw\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.855600 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-utilities\") pod \"community-operators-bgnqw\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.855981 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-catalog-content\") pod \"community-operators-bgnqw\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.877465 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zchpd\" (UniqueName: \"kubernetes.io/projected/d877f548-40fd-440c-9d46-db2b321adc66-kube-api-access-zchpd\") pod \"community-operators-bgnqw\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:14 crc kubenswrapper[4759]: I1205 00:38:14.955078 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:15 crc kubenswrapper[4759]: I1205 00:38:15.494804 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bgnqw"] Dec 05 00:38:15 crc kubenswrapper[4759]: W1205 00:38:15.503694 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd877f548_40fd_440c_9d46_db2b321adc66.slice/crio-851bec1ae0fcf053c43c254c1a1ad4b9a82d01c1420bb66e5933d26f6caa3802 WatchSource:0}: Error finding container 851bec1ae0fcf053c43c254c1a1ad4b9a82d01c1420bb66e5933d26f6caa3802: Status 404 returned error can't find the container with id 851bec1ae0fcf053c43c254c1a1ad4b9a82d01c1420bb66e5933d26f6caa3802 Dec 05 00:38:15 crc kubenswrapper[4759]: I1205 00:38:15.962289 4759 generic.go:334] "Generic (PLEG): container finished" podID="d877f548-40fd-440c-9d46-db2b321adc66" containerID="4d54e741d271c7791cf0c89efdc0870bc50cd137a51a75412b0f0dea60d34dc6" exitCode=0 Dec 05 00:38:15 crc kubenswrapper[4759]: I1205 00:38:15.962586 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgnqw" event={"ID":"d877f548-40fd-440c-9d46-db2b321adc66","Type":"ContainerDied","Data":"4d54e741d271c7791cf0c89efdc0870bc50cd137a51a75412b0f0dea60d34dc6"} Dec 05 00:38:15 crc kubenswrapper[4759]: I1205 00:38:15.962702 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgnqw" event={"ID":"d877f548-40fd-440c-9d46-db2b321adc66","Type":"ContainerStarted","Data":"851bec1ae0fcf053c43c254c1a1ad4b9a82d01c1420bb66e5933d26f6caa3802"} Dec 05 00:38:16 crc kubenswrapper[4759]: I1205 00:38:16.991148 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgnqw" event={"ID":"d877f548-40fd-440c-9d46-db2b321adc66","Type":"ContainerStarted","Data":"034919b9f8e0ee3c2f98ddc23d6ca131286e772b3249c45169aca7166fabdbdb"} Dec 05 00:38:18 crc kubenswrapper[4759]: I1205 00:38:18.002911 4759 generic.go:334] "Generic (PLEG): container finished" podID="d877f548-40fd-440c-9d46-db2b321adc66" containerID="034919b9f8e0ee3c2f98ddc23d6ca131286e772b3249c45169aca7166fabdbdb" exitCode=0 Dec 05 00:38:18 crc kubenswrapper[4759]: I1205 00:38:18.003065 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgnqw" event={"ID":"d877f548-40fd-440c-9d46-db2b321adc66","Type":"ContainerDied","Data":"034919b9f8e0ee3c2f98ddc23d6ca131286e772b3249c45169aca7166fabdbdb"} Dec 05 00:38:18 crc kubenswrapper[4759]: I1205 00:38:18.987037 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ncspd"] Dec 05 00:38:18 crc kubenswrapper[4759]: I1205 00:38:18.988354 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.003255 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ncspd"] Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.012318 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgnqw" event={"ID":"d877f548-40fd-440c-9d46-db2b321adc66","Type":"ContainerStarted","Data":"2f7661969e429c9e9211852bec8da06714c8d90074dec6e1d86d544497634fdc"} Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.045203 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bgnqw" podStartSLOduration=2.597348428 podStartE2EDuration="5.045178247s" podCreationTimestamp="2025-12-05 00:38:14 +0000 UTC" firstStartedPulling="2025-12-05 00:38:15.96394889 +0000 UTC m=+915.179609850" lastFinishedPulling="2025-12-05 00:38:18.411778709 +0000 UTC m=+917.627439669" observedRunningTime="2025-12-05 00:38:19.03884013 +0000 UTC m=+918.254501090" watchObservedRunningTime="2025-12-05 00:38:19.045178247 +0000 UTC m=+918.260839197" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.125016 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-utilities\") pod \"certified-operators-ncspd\" (UID: \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.125070 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79xp9\" (UniqueName: \"kubernetes.io/projected/d3fe08d9-62b7-45e1-8666-e2fcd2323817-kube-api-access-79xp9\") pod \"certified-operators-ncspd\" (UID: \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.125102 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-catalog-content\") pod \"certified-operators-ncspd\" (UID: \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.226226 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-utilities\") pod \"certified-operators-ncspd\" (UID: \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.226282 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79xp9\" (UniqueName: \"kubernetes.io/projected/d3fe08d9-62b7-45e1-8666-e2fcd2323817-kube-api-access-79xp9\") pod \"certified-operators-ncspd\" (UID: \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.226322 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-catalog-content\") pod \"certified-operators-ncspd\" (UID: 
\"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.226805 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-catalog-content\") pod \"certified-operators-ncspd\" (UID: \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.227160 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-utilities\") pod \"certified-operators-ncspd\" (UID: \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.256141 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79xp9\" (UniqueName: \"kubernetes.io/projected/d3fe08d9-62b7-45e1-8666-e2fcd2323817-kube-api-access-79xp9\") pod \"certified-operators-ncspd\" (UID: \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.304818 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:19 crc kubenswrapper[4759]: I1205 00:38:19.754275 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ncspd"] Dec 05 00:38:20 crc kubenswrapper[4759]: I1205 00:38:20.021705 4759 generic.go:334] "Generic (PLEG): container finished" podID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" containerID="138f06fcd96511a32070e0cce3f5db529d6abe793cfa56791272b6aaf40b1e93" exitCode=0 Dec 05 00:38:20 crc kubenswrapper[4759]: I1205 00:38:20.021821 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncspd" event={"ID":"d3fe08d9-62b7-45e1-8666-e2fcd2323817","Type":"ContainerDied","Data":"138f06fcd96511a32070e0cce3f5db529d6abe793cfa56791272b6aaf40b1e93"} Dec 05 00:38:20 crc kubenswrapper[4759]: I1205 00:38:20.021880 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncspd" event={"ID":"d3fe08d9-62b7-45e1-8666-e2fcd2323817","Type":"ContainerStarted","Data":"62150f23ee9a04989bf1d1856d20f2bb65adfe331350d3c30bd4f625d3957a77"} Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.036599 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncspd" event={"ID":"d3fe08d9-62b7-45e1-8666-e2fcd2323817","Type":"ContainerStarted","Data":"0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885"} Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.186674 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tzncm"] Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.188690 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.204836 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzncm"] Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.386118 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-catalog-content\") pod \"redhat-marketplace-tzncm\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.386188 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-utilities\") pod \"redhat-marketplace-tzncm\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.386237 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbf55\" (UniqueName: \"kubernetes.io/projected/b3d7e677-bb14-4e6c-86bb-d466509de247-kube-api-access-bbf55\") pod \"redhat-marketplace-tzncm\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.487273 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-utilities\") pod \"redhat-marketplace-tzncm\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.487355 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbf55\" (UniqueName: \"kubernetes.io/projected/b3d7e677-bb14-4e6c-86bb-d466509de247-kube-api-access-bbf55\") pod \"redhat-marketplace-tzncm\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.487452 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-catalog-content\") pod \"redhat-marketplace-tzncm\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.488048 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-catalog-content\") pod \"redhat-marketplace-tzncm\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.488046 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-utilities\") pod \"redhat-marketplace-tzncm\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.494745 4759 patch_prober.go:28] interesting 
pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.494821 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="e0a9677f-60fe-4bcf-8262-250684b96537" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.511384 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbf55\" (UniqueName: \"kubernetes.io/projected/b3d7e677-bb14-4e6c-86bb-d466509de247-kube-api-access-bbf55\") pod \"redhat-marketplace-tzncm\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:21 crc kubenswrapper[4759]: I1205 00:38:21.528336 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:22 crc kubenswrapper[4759]: I1205 00:38:22.015863 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzncm"] Dec 05 00:38:22 crc kubenswrapper[4759]: I1205 00:38:22.044364 4759 generic.go:334] "Generic (PLEG): container finished" podID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" containerID="0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885" exitCode=0 Dec 05 00:38:22 crc kubenswrapper[4759]: I1205 00:38:22.044444 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncspd" event={"ID":"d3fe08d9-62b7-45e1-8666-e2fcd2323817","Type":"ContainerDied","Data":"0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885"} Dec 05 00:38:22 crc kubenswrapper[4759]: I1205 00:38:22.045655 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzncm" event={"ID":"b3d7e677-bb14-4e6c-86bb-d466509de247","Type":"ContainerStarted","Data":"5f9d7039123931e66804ed0b93df5d6dd46ec886ab9a07b7e04579173996f2d1"} Dec 05 00:38:23 crc kubenswrapper[4759]: I1205 00:38:23.057255 4759 generic.go:334] "Generic (PLEG): container finished" podID="b3d7e677-bb14-4e6c-86bb-d466509de247" containerID="1f0aaf4968d65ea28d210537f95501098f73061410295aa1214a3e2e7a299a67" exitCode=0 Dec 05 00:38:23 crc kubenswrapper[4759]: I1205 00:38:23.057374 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzncm" event={"ID":"b3d7e677-bb14-4e6c-86bb-d466509de247","Type":"ContainerDied","Data":"1f0aaf4968d65ea28d210537f95501098f73061410295aa1214a3e2e7a299a67"} Dec 05 00:38:24 crc kubenswrapper[4759]: I1205 00:38:24.070336 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncspd" event={"ID":"d3fe08d9-62b7-45e1-8666-e2fcd2323817","Type":"ContainerStarted","Data":"887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f"} Dec 05 00:38:24 crc kubenswrapper[4759]: I1205 00:38:24.955391 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:24 crc kubenswrapper[4759]: I1205 00:38:24.955981 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:25 crc kubenswrapper[4759]: I1205 00:38:25.017044 4759 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:25 crc kubenswrapper[4759]: I1205 00:38:25.046631 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ncspd" podStartSLOduration=4.070551504 podStartE2EDuration="7.046588997s" podCreationTimestamp="2025-12-05 00:38:18 +0000 UTC" firstStartedPulling="2025-12-05 00:38:20.023556273 +0000 UTC m=+919.239217233" lastFinishedPulling="2025-12-05 00:38:22.999593766 +0000 UTC m=+922.215254726" observedRunningTime="2025-12-05 00:38:24.100094804 +0000 UTC m=+923.315755764" watchObservedRunningTime="2025-12-05 00:38:25.046588997 +0000 UTC m=+924.262249997" Dec 05 00:38:25 crc kubenswrapper[4759]: I1205 00:38:25.079297 4759 generic.go:334] "Generic (PLEG): container finished" podID="b3d7e677-bb14-4e6c-86bb-d466509de247" containerID="b2e60893f41a02196a8daf04ca973328b55df3ee3b8671a28e6078d1848b0302" exitCode=0 Dec 05 00:38:25 crc kubenswrapper[4759]: I1205 00:38:25.079423 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzncm" event={"ID":"b3d7e677-bb14-4e6c-86bb-d466509de247","Type":"ContainerDied","Data":"b2e60893f41a02196a8daf04ca973328b55df3ee3b8671a28e6078d1848b0302"} Dec 05 00:38:25 crc kubenswrapper[4759]: I1205 00:38:25.134236 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:26 crc kubenswrapper[4759]: I1205 00:38:26.090599 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzncm" event={"ID":"b3d7e677-bb14-4e6c-86bb-d466509de247","Type":"ContainerStarted","Data":"4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8"} Dec 05 00:38:26 crc kubenswrapper[4759]: I1205 00:38:26.116673 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tzncm" podStartSLOduration=2.678306839 podStartE2EDuration="5.11665145s" podCreationTimestamp="2025-12-05 00:38:21 +0000 UTC" firstStartedPulling="2025-12-05 00:38:23.059910923 +0000 UTC m=+922.275571913" lastFinishedPulling="2025-12-05 00:38:25.498255574 +0000 UTC m=+924.713916524" observedRunningTime="2025-12-05 00:38:26.113225165 +0000 UTC m=+925.328886135" watchObservedRunningTime="2025-12-05 00:38:26.11665145 +0000 UTC m=+925.332312410" Dec 05 00:38:27 crc kubenswrapper[4759]: I1205 00:38:27.565863 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bgnqw"] Dec 05 00:38:27 crc kubenswrapper[4759]: I1205 00:38:27.566397 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bgnqw" podUID="d877f548-40fd-440c-9d46-db2b321adc66" containerName="registry-server" containerID="cri-o://2f7661969e429c9e9211852bec8da06714c8d90074dec6e1d86d544497634fdc" gracePeriod=2 Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.111479 4759 generic.go:334] "Generic (PLEG): container finished" podID="d877f548-40fd-440c-9d46-db2b321adc66" containerID="2f7661969e429c9e9211852bec8da06714c8d90074dec6e1d86d544497634fdc" exitCode=0 Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.111551 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgnqw" 
event={"ID":"d877f548-40fd-440c-9d46-db2b321adc66","Type":"ContainerDied","Data":"2f7661969e429c9e9211852bec8da06714c8d90074dec6e1d86d544497634fdc"} Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.458711 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.597930 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-catalog-content\") pod \"d877f548-40fd-440c-9d46-db2b321adc66\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.597988 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zchpd\" (UniqueName: \"kubernetes.io/projected/d877f548-40fd-440c-9d46-db2b321adc66-kube-api-access-zchpd\") pod \"d877f548-40fd-440c-9d46-db2b321adc66\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.598127 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-utilities\") pod \"d877f548-40fd-440c-9d46-db2b321adc66\" (UID: \"d877f548-40fd-440c-9d46-db2b321adc66\") " Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.599774 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-utilities" (OuterVolumeSpecName: "utilities") pod "d877f548-40fd-440c-9d46-db2b321adc66" (UID: "d877f548-40fd-440c-9d46-db2b321adc66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.607265 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d877f548-40fd-440c-9d46-db2b321adc66-kube-api-access-zchpd" (OuterVolumeSpecName: "kube-api-access-zchpd") pod "d877f548-40fd-440c-9d46-db2b321adc66" (UID: "d877f548-40fd-440c-9d46-db2b321adc66"). InnerVolumeSpecName "kube-api-access-zchpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.655535 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d877f548-40fd-440c-9d46-db2b321adc66" (UID: "d877f548-40fd-440c-9d46-db2b321adc66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.699933 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.699985 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d877f548-40fd-440c-9d46-db2b321adc66-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:28 crc kubenswrapper[4759]: I1205 00:38:28.700007 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zchpd\" (UniqueName: \"kubernetes.io/projected/d877f548-40fd-440c-9d46-db2b321adc66-kube-api-access-zchpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:29 crc kubenswrapper[4759]: I1205 00:38:29.126167 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgnqw" event={"ID":"d877f548-40fd-440c-9d46-db2b321adc66","Type":"ContainerDied","Data":"851bec1ae0fcf053c43c254c1a1ad4b9a82d01c1420bb66e5933d26f6caa3802"} Dec 05 00:38:29 crc kubenswrapper[4759]: I1205 00:38:29.126250 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bgnqw" Dec 05 00:38:29 crc kubenswrapper[4759]: I1205 00:38:29.126385 4759 scope.go:117] "RemoveContainer" containerID="2f7661969e429c9e9211852bec8da06714c8d90074dec6e1d86d544497634fdc" Dec 05 00:38:29 crc kubenswrapper[4759]: I1205 00:38:29.156990 4759 scope.go:117] "RemoveContainer" containerID="034919b9f8e0ee3c2f98ddc23d6ca131286e772b3249c45169aca7166fabdbdb" Dec 05 00:38:29 crc kubenswrapper[4759]: I1205 00:38:29.181597 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bgnqw"] Dec 05 00:38:29 crc kubenswrapper[4759]: I1205 00:38:29.187097 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bgnqw"] Dec 05 00:38:29 crc kubenswrapper[4759]: I1205 00:38:29.188574 4759 scope.go:117] "RemoveContainer" containerID="4d54e741d271c7791cf0c89efdc0870bc50cd137a51a75412b0f0dea60d34dc6" Dec 05 00:38:29 crc kubenswrapper[4759]: I1205 00:38:29.305438 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:29 crc kubenswrapper[4759]: I1205 00:38:29.305505 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:29 crc kubenswrapper[4759]: I1205 00:38:29.352755 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:30 crc kubenswrapper[4759]: I1205 00:38:30.185612 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:31 crc kubenswrapper[4759]: I1205 00:38:31.171569 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d877f548-40fd-440c-9d46-db2b321adc66" path="/var/lib/kubelet/pods/d877f548-40fd-440c-9d46-db2b321adc66/volumes" Dec 05 00:38:31 crc kubenswrapper[4759]: I1205 00:38:31.494366 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Dec 05 00:38:31 crc kubenswrapper[4759]: I1205 00:38:31.528779 4759 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:31 crc kubenswrapper[4759]: I1205 00:38:31.528827 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:31 crc kubenswrapper[4759]: I1205 00:38:31.604271 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:31 crc kubenswrapper[4759]: I1205 00:38:31.966032 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ncspd"] Dec 05 00:38:32 crc kubenswrapper[4759]: I1205 00:38:32.202350 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:33 crc kubenswrapper[4759]: I1205 00:38:33.163670 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ncspd" podUID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" containerName="registry-server" containerID="cri-o://887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f" gracePeriod=2 Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.101041 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.171636 4759 generic.go:334] "Generic (PLEG): container finished" podID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" containerID="887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f" exitCode=0 Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.171692 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ncspd" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.171705 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncspd" event={"ID":"d3fe08d9-62b7-45e1-8666-e2fcd2323817","Type":"ContainerDied","Data":"887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f"} Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.171804 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ncspd" event={"ID":"d3fe08d9-62b7-45e1-8666-e2fcd2323817","Type":"ContainerDied","Data":"62150f23ee9a04989bf1d1856d20f2bb65adfe331350d3c30bd4f625d3957a77"} Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.171841 4759 scope.go:117] "RemoveContainer" containerID="887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.187403 4759 scope.go:117] "RemoveContainer" containerID="0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.203424 4759 scope.go:117] "RemoveContainer" containerID="138f06fcd96511a32070e0cce3f5db529d6abe793cfa56791272b6aaf40b1e93" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.240363 4759 scope.go:117] "RemoveContainer" containerID="887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f" Dec 05 00:38:34 crc kubenswrapper[4759]: E1205 00:38:34.240977 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f\": container with ID starting with 887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f not found: ID does not exist" containerID="887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.241021 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f"} err="failed to get container status \"887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f\": rpc error: code = NotFound desc = could not find container \"887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f\": container with ID starting with 887c40c510f64c593d861590e768ccf72a09212c8f04f26b5f9e141eb65b136f not found: ID does not exist" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.241052 4759 scope.go:117] "RemoveContainer" containerID="0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885" Dec 05 00:38:34 crc kubenswrapper[4759]: E1205 00:38:34.241622 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885\": container with ID starting with 0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885 not found: ID does not exist" containerID="0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.241691 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885"} err="failed to get container status \"0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885\": rpc error: code = NotFound desc = could not find container 
\"0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885\": container with ID starting with 0669c0edbcaece1561935f95dcfdee391bf2299b44594de8976a0886c4610885 not found: ID does not exist" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.241734 4759 scope.go:117] "RemoveContainer" containerID="138f06fcd96511a32070e0cce3f5db529d6abe793cfa56791272b6aaf40b1e93" Dec 05 00:38:34 crc kubenswrapper[4759]: E1205 00:38:34.242235 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"138f06fcd96511a32070e0cce3f5db529d6abe793cfa56791272b6aaf40b1e93\": container with ID starting with 138f06fcd96511a32070e0cce3f5db529d6abe793cfa56791272b6aaf40b1e93 not found: ID does not exist" containerID="138f06fcd96511a32070e0cce3f5db529d6abe793cfa56791272b6aaf40b1e93" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.242264 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138f06fcd96511a32070e0cce3f5db529d6abe793cfa56791272b6aaf40b1e93"} err="failed to get container status \"138f06fcd96511a32070e0cce3f5db529d6abe793cfa56791272b6aaf40b1e93\": rpc error: code = NotFound desc = could not find container \"138f06fcd96511a32070e0cce3f5db529d6abe793cfa56791272b6aaf40b1e93\": container with ID starting with 138f06fcd96511a32070e0cce3f5db529d6abe793cfa56791272b6aaf40b1e93 not found: ID does not exist" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.302515 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-catalog-content\") pod \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\" (UID: \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.302729 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79xp9\" (UniqueName: \"kubernetes.io/projected/d3fe08d9-62b7-45e1-8666-e2fcd2323817-kube-api-access-79xp9\") pod \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\" (UID: \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.302822 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-utilities\") pod \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\" (UID: \"d3fe08d9-62b7-45e1-8666-e2fcd2323817\") " Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.304436 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-utilities" (OuterVolumeSpecName: "utilities") pod "d3fe08d9-62b7-45e1-8666-e2fcd2323817" (UID: "d3fe08d9-62b7-45e1-8666-e2fcd2323817"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.308891 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3fe08d9-62b7-45e1-8666-e2fcd2323817-kube-api-access-79xp9" (OuterVolumeSpecName: "kube-api-access-79xp9") pod "d3fe08d9-62b7-45e1-8666-e2fcd2323817" (UID: "d3fe08d9-62b7-45e1-8666-e2fcd2323817"). InnerVolumeSpecName "kube-api-access-79xp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.357914 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzncm"] Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.358213 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tzncm" podUID="b3d7e677-bb14-4e6c-86bb-d466509de247" containerName="registry-server" containerID="cri-o://4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8" gracePeriod=2 Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.384008 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3fe08d9-62b7-45e1-8666-e2fcd2323817" (UID: "d3fe08d9-62b7-45e1-8666-e2fcd2323817"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.405076 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79xp9\" (UniqueName: \"kubernetes.io/projected/d3fe08d9-62b7-45e1-8666-e2fcd2323817-kube-api-access-79xp9\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.405378 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.405509 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3fe08d9-62b7-45e1-8666-e2fcd2323817-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.433542 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.433826 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.508433 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ncspd"] Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.512660 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ncspd"] Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.730128 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.911758 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-catalog-content\") pod \"b3d7e677-bb14-4e6c-86bb-d466509de247\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.911811 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbf55\" (UniqueName: \"kubernetes.io/projected/b3d7e677-bb14-4e6c-86bb-d466509de247-kube-api-access-bbf55\") pod \"b3d7e677-bb14-4e6c-86bb-d466509de247\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.911984 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-utilities\") pod \"b3d7e677-bb14-4e6c-86bb-d466509de247\" (UID: \"b3d7e677-bb14-4e6c-86bb-d466509de247\") " Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.912846 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-utilities" (OuterVolumeSpecName: "utilities") pod "b3d7e677-bb14-4e6c-86bb-d466509de247" (UID: "b3d7e677-bb14-4e6c-86bb-d466509de247"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.918525 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d7e677-bb14-4e6c-86bb-d466509de247-kube-api-access-bbf55" (OuterVolumeSpecName: "kube-api-access-bbf55") pod "b3d7e677-bb14-4e6c-86bb-d466509de247" (UID: "b3d7e677-bb14-4e6c-86bb-d466509de247"). InnerVolumeSpecName "kube-api-access-bbf55". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:38:34 crc kubenswrapper[4759]: I1205 00:38:34.930603 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3d7e677-bb14-4e6c-86bb-d466509de247" (UID: "b3d7e677-bb14-4e6c-86bb-d466509de247"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.013663 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.013931 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d7e677-bb14-4e6c-86bb-d466509de247-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.014050 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbf55\" (UniqueName: \"kubernetes.io/projected/b3d7e677-bb14-4e6c-86bb-d466509de247-kube-api-access-bbf55\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.173148 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" path="/var/lib/kubelet/pods/d3fe08d9-62b7-45e1-8666-e2fcd2323817/volumes" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.187375 4759 generic.go:334] "Generic (PLEG): container finished" podID="b3d7e677-bb14-4e6c-86bb-d466509de247" containerID="4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8" exitCode=0 Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.187482 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzncm" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.187500 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzncm" event={"ID":"b3d7e677-bb14-4e6c-86bb-d466509de247","Type":"ContainerDied","Data":"4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8"} Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.187544 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzncm" event={"ID":"b3d7e677-bb14-4e6c-86bb-d466509de247","Type":"ContainerDied","Data":"5f9d7039123931e66804ed0b93df5d6dd46ec886ab9a07b7e04579173996f2d1"} Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.187582 4759 scope.go:117] "RemoveContainer" containerID="4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.215785 4759 scope.go:117] "RemoveContainer" containerID="b2e60893f41a02196a8daf04ca973328b55df3ee3b8671a28e6078d1848b0302" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.243679 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzncm"] Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.250458 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzncm"] Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.256241 4759 scope.go:117] "RemoveContainer" containerID="1f0aaf4968d65ea28d210537f95501098f73061410295aa1214a3e2e7a299a67" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.276601 4759 scope.go:117] "RemoveContainer" containerID="4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8" Dec 05 00:38:35 crc kubenswrapper[4759]: E1205 00:38:35.277263 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8\": container with ID 
starting with 4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8 not found: ID does not exist" containerID="4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.277289 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8"} err="failed to get container status \"4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8\": rpc error: code = NotFound desc = could not find container \"4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8\": container with ID starting with 4cbc961b1f85a20ef4c2872584c6d4b701d67081327fb6d839fc08df89c001a8 not found: ID does not exist" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.277330 4759 scope.go:117] "RemoveContainer" containerID="b2e60893f41a02196a8daf04ca973328b55df3ee3b8671a28e6078d1848b0302" Dec 05 00:38:35 crc kubenswrapper[4759]: E1205 00:38:35.281504 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e60893f41a02196a8daf04ca973328b55df3ee3b8671a28e6078d1848b0302\": container with ID starting with b2e60893f41a02196a8daf04ca973328b55df3ee3b8671a28e6078d1848b0302 not found: ID does not exist" containerID="b2e60893f41a02196a8daf04ca973328b55df3ee3b8671a28e6078d1848b0302" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.281523 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e60893f41a02196a8daf04ca973328b55df3ee3b8671a28e6078d1848b0302"} err="failed to get container status \"b2e60893f41a02196a8daf04ca973328b55df3ee3b8671a28e6078d1848b0302\": rpc error: code = NotFound desc = could not find container \"b2e60893f41a02196a8daf04ca973328b55df3ee3b8671a28e6078d1848b0302\": container with ID starting with b2e60893f41a02196a8daf04ca973328b55df3ee3b8671a28e6078d1848b0302 not found: ID does not exist" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.281538 4759 scope.go:117] "RemoveContainer" containerID="1f0aaf4968d65ea28d210537f95501098f73061410295aa1214a3e2e7a299a67" Dec 05 00:38:35 crc kubenswrapper[4759]: E1205 00:38:35.281910 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0aaf4968d65ea28d210537f95501098f73061410295aa1214a3e2e7a299a67\": container with ID starting with 1f0aaf4968d65ea28d210537f95501098f73061410295aa1214a3e2e7a299a67 not found: ID does not exist" containerID="1f0aaf4968d65ea28d210537f95501098f73061410295aa1214a3e2e7a299a67" Dec 05 00:38:35 crc kubenswrapper[4759]: I1205 00:38:35.281929 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0aaf4968d65ea28d210537f95501098f73061410295aa1214a3e2e7a299a67"} err="failed to get container status \"1f0aaf4968d65ea28d210537f95501098f73061410295aa1214a3e2e7a299a67\": rpc error: code = NotFound desc = could not find container \"1f0aaf4968d65ea28d210537f95501098f73061410295aa1214a3e2e7a299a67\": container with ID starting with 1f0aaf4968d65ea28d210537f95501098f73061410295aa1214a3e2e7a299a67 not found: ID does not exist" Dec 05 00:38:37 crc kubenswrapper[4759]: I1205 00:38:37.164766 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d7e677-bb14-4e6c-86bb-d466509de247" path="/var/lib/kubelet/pods/b3d7e677-bb14-4e6c-86bb-d466509de247/volumes" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 
00:38:50.513797 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-fxjdz"] Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.514458 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" containerName="registry-server" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.514470 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" containerName="registry-server" Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.514479 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d7e677-bb14-4e6c-86bb-d466509de247" containerName="registry-server" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.514486 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d7e677-bb14-4e6c-86bb-d466509de247" containerName="registry-server" Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.514493 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d877f548-40fd-440c-9d46-db2b321adc66" containerName="extract-utilities" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.514499 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d877f548-40fd-440c-9d46-db2b321adc66" containerName="extract-utilities" Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.514510 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d877f548-40fd-440c-9d46-db2b321adc66" containerName="registry-server" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.514515 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d877f548-40fd-440c-9d46-db2b321adc66" containerName="registry-server" Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.514528 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" containerName="extract-utilities" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.514533 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" containerName="extract-utilities" Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.514541 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d877f548-40fd-440c-9d46-db2b321adc66" containerName="extract-content" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.514547 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d877f548-40fd-440c-9d46-db2b321adc66" containerName="extract-content" Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.514555 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d7e677-bb14-4e6c-86bb-d466509de247" containerName="extract-content" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.514562 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d7e677-bb14-4e6c-86bb-d466509de247" containerName="extract-content" Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.514575 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" containerName="extract-content" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.514580 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" containerName="extract-content" Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.514593 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d7e677-bb14-4e6c-86bb-d466509de247" containerName="extract-utilities" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 
00:38:50.514599 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d7e677-bb14-4e6c-86bb-d466509de247" containerName="extract-utilities" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.514700 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fe08d9-62b7-45e1-8666-e2fcd2323817" containerName="registry-server" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.514714 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d877f548-40fd-440c-9d46-db2b321adc66" containerName="registry-server" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.514721 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d7e677-bb14-4e6c-86bb-d466509de247" containerName="registry-server" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.515176 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.518887 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.519453 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.520519 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.520623 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-jwkxb" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.520803 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.528857 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.549618 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-fxjdz"] Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.566427 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-syslog-receiver\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.566470 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.566492 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-metrics\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.566513 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: 
\"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config-openshift-service-cacrt\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.566536 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-token\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.566558 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj9rx\" (UniqueName: \"kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-kube-api-access-zj9rx\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.566576 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-trusted-ca\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.566610 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6d9a9828-37f6-4444-908b-8c3809d47eee-datadir\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.566625 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-sa-token\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.566641 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-entrypoint\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.566657 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d9a9828-37f6-4444-908b-8c3809d47eee-tmp\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.600360 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-fxjdz"] Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.600809 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-zj9rx metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-fxjdz" podUID="6d9a9828-37f6-4444-908b-8c3809d47eee" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 
00:38:50.667432 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6d9a9828-37f6-4444-908b-8c3809d47eee-datadir\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.667487 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-sa-token\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.667508 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-entrypoint\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.667534 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d9a9828-37f6-4444-908b-8c3809d47eee-tmp\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.667567 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6d9a9828-37f6-4444-908b-8c3809d47eee-datadir\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.667581 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-syslog-receiver\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.667605 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.667632 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-metrics\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.667661 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config-openshift-service-cacrt\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.667690 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-token\") pod \"collector-fxjdz\" (UID: 
\"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.667719 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj9rx\" (UniqueName: \"kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-kube-api-access-zj9rx\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.667743 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-trusted-ca\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.667921 4759 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Dec 05 00:38:50 crc kubenswrapper[4759]: E1205 00:38:50.667981 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-metrics podName:6d9a9828-37f6-4444-908b-8c3809d47eee nodeName:}" failed. No retries permitted until 2025-12-05 00:38:51.167960857 +0000 UTC m=+950.383621807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-metrics") pod "collector-fxjdz" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee") : secret "collector-metrics" not found Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.668462 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-entrypoint\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.668649 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-trusted-ca\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.669212 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config-openshift-service-cacrt\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.670002 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.673357 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d9a9828-37f6-4444-908b-8c3809d47eee-tmp\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.675643 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-syslog-receiver\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.678660 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-token\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.688454 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-sa-token\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:50 crc kubenswrapper[4759]: I1205 00:38:50.689549 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj9rx\" (UniqueName: \"kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-kube-api-access-zj9rx\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.177929 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-metrics\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.186469 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-metrics\") pod \"collector-fxjdz\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " pod="openshift-logging/collector-fxjdz" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.327067 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-fxjdz" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.338391 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-fxjdz" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.482890 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config\") pod \"6d9a9828-37f6-4444-908b-8c3809d47eee\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.482989 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d9a9828-37f6-4444-908b-8c3809d47eee-tmp\") pod \"6d9a9828-37f6-4444-908b-8c3809d47eee\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.483092 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj9rx\" (UniqueName: \"kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-kube-api-access-zj9rx\") pod \"6d9a9828-37f6-4444-908b-8c3809d47eee\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.483149 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-trusted-ca\") pod \"6d9a9828-37f6-4444-908b-8c3809d47eee\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.483197 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-syslog-receiver\") pod \"6d9a9828-37f6-4444-908b-8c3809d47eee\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.483266 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-metrics\") pod \"6d9a9828-37f6-4444-908b-8c3809d47eee\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.483351 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-sa-token\") pod \"6d9a9828-37f6-4444-908b-8c3809d47eee\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.483390 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-token\") pod \"6d9a9828-37f6-4444-908b-8c3809d47eee\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.483428 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-entrypoint\") pod \"6d9a9828-37f6-4444-908b-8c3809d47eee\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.483500 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6d9a9828-37f6-4444-908b-8c3809d47eee-datadir\") pod \"6d9a9828-37f6-4444-908b-8c3809d47eee\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " Dec 05 
00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.483553 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config-openshift-service-cacrt\") pod \"6d9a9828-37f6-4444-908b-8c3809d47eee\" (UID: \"6d9a9828-37f6-4444-908b-8c3809d47eee\") " Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.484082 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d9a9828-37f6-4444-908b-8c3809d47eee-datadir" (OuterVolumeSpecName: "datadir") pod "6d9a9828-37f6-4444-908b-8c3809d47eee" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.484377 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config" (OuterVolumeSpecName: "config") pod "6d9a9828-37f6-4444-908b-8c3809d47eee" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.484429 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6d9a9828-37f6-4444-908b-8c3809d47eee" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.484502 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "6d9a9828-37f6-4444-908b-8c3809d47eee" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.484894 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "6d9a9828-37f6-4444-908b-8c3809d47eee" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.488579 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-sa-token" (OuterVolumeSpecName: "sa-token") pod "6d9a9828-37f6-4444-908b-8c3809d47eee" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.489246 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-token" (OuterVolumeSpecName: "collector-token") pod "6d9a9828-37f6-4444-908b-8c3809d47eee" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee"). InnerVolumeSpecName "collector-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.489496 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-metrics" (OuterVolumeSpecName: "metrics") pod "6d9a9828-37f6-4444-908b-8c3809d47eee" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.489548 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d9a9828-37f6-4444-908b-8c3809d47eee-tmp" (OuterVolumeSpecName: "tmp") pod "6d9a9828-37f6-4444-908b-8c3809d47eee" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.489553 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-kube-api-access-zj9rx" (OuterVolumeSpecName: "kube-api-access-zj9rx") pod "6d9a9828-37f6-4444-908b-8c3809d47eee" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee"). InnerVolumeSpecName "kube-api-access-zj9rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.490411 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "6d9a9828-37f6-4444-908b-8c3809d47eee" (UID: "6d9a9828-37f6-4444-908b-8c3809d47eee"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.584970 4759 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6d9a9828-37f6-4444-908b-8c3809d47eee-datadir\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.585032 4759 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.585054 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.585102 4759 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d9a9828-37f6-4444-908b-8c3809d47eee-tmp\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.585120 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj9rx\" (UniqueName: \"kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-kube-api-access-zj9rx\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.585139 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.585156 4759 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" 
(UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.585174 4759 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.585190 4759 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6d9a9828-37f6-4444-908b-8c3809d47eee-collector-token\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.585208 4759 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6d9a9828-37f6-4444-908b-8c3809d47eee-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:51 crc kubenswrapper[4759]: I1205 00:38:51.585225 4759 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6d9a9828-37f6-4444-908b-8c3809d47eee-entrypoint\") on node \"crc\" DevicePath \"\"" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.331533 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-fxjdz" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.388291 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-fxjdz"] Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.393028 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-fxjdz"] Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.409073 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-dx9kr"] Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.410168 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.414213 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.414608 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-jwkxb" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.417114 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.417271 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.417286 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.431992 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.451544 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-dx9kr"] Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.500136 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-config-openshift-service-cacrt\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.500211 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-config\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.500270 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-collector-token\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.500391 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-sa-token\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.500455 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-trusted-ca\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.500520 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-tmp\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " 
pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.500552 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-entrypoint\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.500665 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7fpn\" (UniqueName: \"kubernetes.io/projected/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-kube-api-access-s7fpn\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.500720 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-collector-syslog-receiver\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.500748 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-metrics\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.500785 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-datadir\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.601724 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-datadir\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.601828 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-config-openshift-service-cacrt\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.601866 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-config\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.601903 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-collector-token\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.601945 4759 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-sa-token\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.601982 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-trusted-ca\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.602028 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-tmp\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.602059 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-entrypoint\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.602123 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7fpn\" (UniqueName: \"kubernetes.io/projected/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-kube-api-access-s7fpn\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.602168 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-collector-syslog-receiver\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.602206 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-metrics\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.602962 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-config-openshift-service-cacrt\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.601824 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-datadir\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.604823 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-entrypoint\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " 
pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.606252 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-trusted-ca\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.606484 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-config\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.609360 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-collector-syslog-receiver\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.611356 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-collector-token\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.617890 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-tmp\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.622526 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-metrics\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.622705 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-sa-token\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.635825 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7fpn\" (UniqueName: \"kubernetes.io/projected/35a9bf94-4e4a-4d68-95d8-1ae9421bb76f-kube-api-access-s7fpn\") pod \"collector-dx9kr\" (UID: \"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f\") " pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.747987 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-dx9kr" Dec 05 00:38:52 crc kubenswrapper[4759]: I1205 00:38:52.962589 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-dx9kr"] Dec 05 00:38:53 crc kubenswrapper[4759]: I1205 00:38:53.168124 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9a9828-37f6-4444-908b-8c3809d47eee" path="/var/lib/kubelet/pods/6d9a9828-37f6-4444-908b-8c3809d47eee/volumes" Dec 05 00:38:53 crc kubenswrapper[4759]: I1205 00:38:53.338545 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-dx9kr" event={"ID":"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f","Type":"ContainerStarted","Data":"dd74bbd1f8979f232849f7be8994f360f9eb590ff7cb8ecf69da57a3827edd2f"} Dec 05 00:38:59 crc kubenswrapper[4759]: I1205 00:38:59.377113 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-dx9kr" event={"ID":"35a9bf94-4e4a-4d68-95d8-1ae9421bb76f","Type":"ContainerStarted","Data":"73570c31cc1904875b372427d4a500e1c318cd40b2a7b312f4c0a0c7cae3fbdd"} Dec 05 00:38:59 crc kubenswrapper[4759]: I1205 00:38:59.406296 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-dx9kr" podStartSLOduration=1.69521481 podStartE2EDuration="7.4062775s" podCreationTimestamp="2025-12-05 00:38:52 +0000 UTC" firstStartedPulling="2025-12-05 00:38:52.976046252 +0000 UTC m=+952.191707202" lastFinishedPulling="2025-12-05 00:38:58.687108942 +0000 UTC m=+957.902769892" observedRunningTime="2025-12-05 00:38:59.405295777 +0000 UTC m=+958.620956787" watchObservedRunningTime="2025-12-05 00:38:59.4062775 +0000 UTC m=+958.621938460" Dec 05 00:39:04 crc kubenswrapper[4759]: I1205 00:39:04.433641 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:39:04 crc kubenswrapper[4759]: I1205 00:39:04.434294 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:39:04 crc kubenswrapper[4759]: I1205 00:39:04.434395 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:39:04 crc kubenswrapper[4759]: I1205 00:39:04.435366 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9dc561c03abbf48d110afab06130c27d9acde9e46805bcdeb203f6dfe142c6b"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 00:39:04 crc kubenswrapper[4759]: I1205 00:39:04.435524 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://d9dc561c03abbf48d110afab06130c27d9acde9e46805bcdeb203f6dfe142c6b" gracePeriod=600 Dec 05 00:39:05 crc kubenswrapper[4759]: I1205 00:39:05.437103 4759 
generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="d9dc561c03abbf48d110afab06130c27d9acde9e46805bcdeb203f6dfe142c6b" exitCode=0 Dec 05 00:39:05 crc kubenswrapper[4759]: I1205 00:39:05.437167 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"d9dc561c03abbf48d110afab06130c27d9acde9e46805bcdeb203f6dfe142c6b"} Dec 05 00:39:05 crc kubenswrapper[4759]: I1205 00:39:05.437587 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"608c9e503141cdce7db4aff57a9358edcfc9dc62d8dc9293f4521db2be6238ce"} Dec 05 00:39:05 crc kubenswrapper[4759]: I1205 00:39:05.437618 4759 scope.go:117] "RemoveContainer" containerID="18e2e3c4d5d7e9b0a92421362e0a25a15c418034c2ed08a024ef5b3fb196fc6f" Dec 05 00:39:24 crc kubenswrapper[4759]: I1205 00:39:24.834481 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz"] Dec 05 00:39:24 crc kubenswrapper[4759]: I1205 00:39:24.836586 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:24 crc kubenswrapper[4759]: I1205 00:39:24.839691 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 00:39:24 crc kubenswrapper[4759]: I1205 00:39:24.845904 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz"] Dec 05 00:39:24 crc kubenswrapper[4759]: I1205 00:39:24.924314 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dk55\" (UniqueName: \"kubernetes.io/projected/1872bbfd-0448-4ee8-af95-9b4db67c58c9-kube-api-access-8dk55\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:24 crc kubenswrapper[4759]: I1205 00:39:24.924433 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:24 crc kubenswrapper[4759]: I1205 00:39:24.924542 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:25 crc kubenswrapper[4759]: I1205 00:39:25.026361 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dk55\" (UniqueName: 
\"kubernetes.io/projected/1872bbfd-0448-4ee8-af95-9b4db67c58c9-kube-api-access-8dk55\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:25 crc kubenswrapper[4759]: I1205 00:39:25.026428 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:25 crc kubenswrapper[4759]: I1205 00:39:25.026485 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:25 crc kubenswrapper[4759]: I1205 00:39:25.027138 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:25 crc kubenswrapper[4759]: I1205 00:39:25.027341 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:25 crc kubenswrapper[4759]: I1205 00:39:25.049044 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dk55\" (UniqueName: \"kubernetes.io/projected/1872bbfd-0448-4ee8-af95-9b4db67c58c9-kube-api-access-8dk55\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:25 crc kubenswrapper[4759]: I1205 00:39:25.196569 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:25 crc kubenswrapper[4759]: I1205 00:39:25.631630 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz"] Dec 05 00:39:26 crc kubenswrapper[4759]: I1205 00:39:26.601814 4759 generic.go:334] "Generic (PLEG): container finished" podID="1872bbfd-0448-4ee8-af95-9b4db67c58c9" containerID="4708ebc9cd5316ff4f9382b996f579d5093d02bd99b7e2ba458567dff6a820d1" exitCode=0 Dec 05 00:39:26 crc kubenswrapper[4759]: I1205 00:39:26.601898 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" event={"ID":"1872bbfd-0448-4ee8-af95-9b4db67c58c9","Type":"ContainerDied","Data":"4708ebc9cd5316ff4f9382b996f579d5093d02bd99b7e2ba458567dff6a820d1"} Dec 05 00:39:26 crc kubenswrapper[4759]: I1205 00:39:26.601983 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" event={"ID":"1872bbfd-0448-4ee8-af95-9b4db67c58c9","Type":"ContainerStarted","Data":"b8dc17ab2abfa258138ab5bc4f5df48814e771e3e467351dee83448b3e57f290"} Dec 05 00:39:28 crc kubenswrapper[4759]: I1205 00:39:28.619205 4759 generic.go:334] "Generic (PLEG): container finished" podID="1872bbfd-0448-4ee8-af95-9b4db67c58c9" containerID="cde2022acbf2299c5ae07366afc30538f48ea59ead78bd75f282cb918d5348de" exitCode=0 Dec 05 00:39:28 crc kubenswrapper[4759]: I1205 00:39:28.619275 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" event={"ID":"1872bbfd-0448-4ee8-af95-9b4db67c58c9","Type":"ContainerDied","Data":"cde2022acbf2299c5ae07366afc30538f48ea59ead78bd75f282cb918d5348de"} Dec 05 00:39:29 crc kubenswrapper[4759]: I1205 00:39:29.635936 4759 generic.go:334] "Generic (PLEG): container finished" podID="1872bbfd-0448-4ee8-af95-9b4db67c58c9" containerID="69c6d9d80ea58bd26843a08ebf13e93220760b7c484c8be12ada48806f5d304a" exitCode=0 Dec 05 00:39:29 crc kubenswrapper[4759]: I1205 00:39:29.636564 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" event={"ID":"1872bbfd-0448-4ee8-af95-9b4db67c58c9","Type":"ContainerDied","Data":"69c6d9d80ea58bd26843a08ebf13e93220760b7c484c8be12ada48806f5d304a"} Dec 05 00:39:30 crc kubenswrapper[4759]: I1205 00:39:30.993086 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.126555 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-util\") pod \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.126883 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dk55\" (UniqueName: \"kubernetes.io/projected/1872bbfd-0448-4ee8-af95-9b4db67c58c9-kube-api-access-8dk55\") pod \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.127054 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-bundle\") pod \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\" (UID: \"1872bbfd-0448-4ee8-af95-9b4db67c58c9\") " Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.127561 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-bundle" (OuterVolumeSpecName: "bundle") pod "1872bbfd-0448-4ee8-af95-9b4db67c58c9" (UID: "1872bbfd-0448-4ee8-af95-9b4db67c58c9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.131826 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1872bbfd-0448-4ee8-af95-9b4db67c58c9-kube-api-access-8dk55" (OuterVolumeSpecName: "kube-api-access-8dk55") pod "1872bbfd-0448-4ee8-af95-9b4db67c58c9" (UID: "1872bbfd-0448-4ee8-af95-9b4db67c58c9"). InnerVolumeSpecName "kube-api-access-8dk55". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.229672 4759 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.229710 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dk55\" (UniqueName: \"kubernetes.io/projected/1872bbfd-0448-4ee8-af95-9b4db67c58c9-kube-api-access-8dk55\") on node \"crc\" DevicePath \"\"" Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.327895 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-util" (OuterVolumeSpecName: "util") pod "1872bbfd-0448-4ee8-af95-9b4db67c58c9" (UID: "1872bbfd-0448-4ee8-af95-9b4db67c58c9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.331415 4759 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1872bbfd-0448-4ee8-af95-9b4db67c58c9-util\") on node \"crc\" DevicePath \"\"" Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.652106 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" event={"ID":"1872bbfd-0448-4ee8-af95-9b4db67c58c9","Type":"ContainerDied","Data":"b8dc17ab2abfa258138ab5bc4f5df48814e771e3e467351dee83448b3e57f290"} Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.652179 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8dc17ab2abfa258138ab5bc4f5df48814e771e3e467351dee83448b3e57f290" Dec 05 00:39:31 crc kubenswrapper[4759]: I1205 00:39:31.652603 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.585374 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-w55p6"] Dec 05 00:39:34 crc kubenswrapper[4759]: E1205 00:39:34.586023 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1872bbfd-0448-4ee8-af95-9b4db67c58c9" containerName="util" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.586041 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1872bbfd-0448-4ee8-af95-9b4db67c58c9" containerName="util" Dec 05 00:39:34 crc kubenswrapper[4759]: E1205 00:39:34.586056 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1872bbfd-0448-4ee8-af95-9b4db67c58c9" containerName="pull" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.586064 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1872bbfd-0448-4ee8-af95-9b4db67c58c9" containerName="pull" Dec 05 00:39:34 crc kubenswrapper[4759]: E1205 00:39:34.586078 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1872bbfd-0448-4ee8-af95-9b4db67c58c9" containerName="extract" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.586086 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1872bbfd-0448-4ee8-af95-9b4db67c58c9" containerName="extract" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.586235 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1872bbfd-0448-4ee8-af95-9b4db67c58c9" containerName="extract" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.586942 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w55p6" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.589679 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-hzfn9" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.589939 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.590152 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.598754 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-w55p6"] Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.675204 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgzrz\" (UniqueName: \"kubernetes.io/projected/c0583a6d-7e56-455f-8557-f78732ffd0dc-kube-api-access-sgzrz\") pod \"nmstate-operator-5b5b58f5c8-w55p6\" (UID: \"c0583a6d-7e56-455f-8557-f78732ffd0dc\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w55p6" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.776827 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgzrz\" (UniqueName: \"kubernetes.io/projected/c0583a6d-7e56-455f-8557-f78732ffd0dc-kube-api-access-sgzrz\") pod \"nmstate-operator-5b5b58f5c8-w55p6\" (UID: \"c0583a6d-7e56-455f-8557-f78732ffd0dc\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w55p6" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.798731 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgzrz\" (UniqueName: \"kubernetes.io/projected/c0583a6d-7e56-455f-8557-f78732ffd0dc-kube-api-access-sgzrz\") pod \"nmstate-operator-5b5b58f5c8-w55p6\" (UID: \"c0583a6d-7e56-455f-8557-f78732ffd0dc\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w55p6" Dec 05 00:39:34 crc kubenswrapper[4759]: I1205 00:39:34.905991 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w55p6" Dec 05 00:39:35 crc kubenswrapper[4759]: I1205 00:39:35.361881 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-w55p6"] Dec 05 00:39:35 crc kubenswrapper[4759]: I1205 00:39:35.676437 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w55p6" event={"ID":"c0583a6d-7e56-455f-8557-f78732ffd0dc","Type":"ContainerStarted","Data":"a00f2cc3bed975a91b5c59c0bdcfd17694233f4e68c2c6dd9e3aef534e3206ac"} Dec 05 00:39:37 crc kubenswrapper[4759]: I1205 00:39:37.690032 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w55p6" event={"ID":"c0583a6d-7e56-455f-8557-f78732ffd0dc","Type":"ContainerStarted","Data":"fc1d3a1e739c3206d332bf69c63c448a5750be6223c07f22ebc345d5a1b66841"} Dec 05 00:39:37 crc kubenswrapper[4759]: I1205 00:39:37.711732 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w55p6" podStartSLOduration=1.815823581 podStartE2EDuration="3.71171155s" podCreationTimestamp="2025-12-05 00:39:34 +0000 UTC" firstStartedPulling="2025-12-05 00:39:35.369546918 +0000 UTC m=+994.585207858" lastFinishedPulling="2025-12-05 00:39:37.265434877 +0000 UTC m=+996.481095827" observedRunningTime="2025-12-05 00:39:37.705194327 +0000 UTC m=+996.920855277" watchObservedRunningTime="2025-12-05 00:39:37.71171155 +0000 UTC m=+996.927372500" Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.637917 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d"] Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.638961 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d" Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.642266 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-nnmp8" Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.649264 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"] Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.650422 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr" Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.652843 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.653590 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d"] Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.669608 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"] Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.711978 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-nsbmm"] Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.713085 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.735743 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmg2t\" (UniqueName: \"kubernetes.io/projected/463bc80d-5fb1-4bf0-b596-4f41571b3178-kube-api-access-lmg2t\") pod \"nmstate-metrics-7f946cbc9-hhc6d\" (UID: \"463bc80d-5fb1-4bf0-b596-4f41571b3178\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.735802 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8552v\" (UniqueName: \"kubernetes.io/projected/05de94e3-b61f-4df3-a8f4-a0b97d65b575-kube-api-access-8552v\") pod \"nmstate-webhook-5f6d4c5ccb-9ztsr\" (UID: \"05de94e3-b61f-4df3-a8f4-a0b97d65b575\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.735837 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/05de94e3-b61f-4df3-a8f4-a0b97d65b575-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-9ztsr\" (UID: \"05de94e3-b61f-4df3-a8f4-a0b97d65b575\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.836981 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8552v\" (UniqueName: \"kubernetes.io/projected/05de94e3-b61f-4df3-a8f4-a0b97d65b575-kube-api-access-8552v\") pod \"nmstate-webhook-5f6d4c5ccb-9ztsr\" (UID: \"05de94e3-b61f-4df3-a8f4-a0b97d65b575\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.837041 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/05de94e3-b61f-4df3-a8f4-a0b97d65b575-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-9ztsr\" (UID: \"05de94e3-b61f-4df3-a8f4-a0b97d65b575\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.837082 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1c4aa01f-df16-4f20-914f-1238c9c497ab-dbus-socket\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.837128 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1c4aa01f-df16-4f20-914f-1238c9c497ab-nmstate-lock\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.837159 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1c4aa01f-df16-4f20-914f-1238c9c497ab-ovs-socket\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.837179 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwqn\" (UniqueName: \"kubernetes.io/projected/1c4aa01f-df16-4f20-914f-1238c9c497ab-kube-api-access-dqwqn\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: E1205 00:39:38.837330 4759 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Dec 05 00:39:38 crc kubenswrapper[4759]: E1205 00:39:38.837407 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05de94e3-b61f-4df3-a8f4-a0b97d65b575-tls-key-pair podName:05de94e3-b61f-4df3-a8f4-a0b97d65b575 nodeName:}" failed. No retries permitted until 2025-12-05 00:39:39.337382853 +0000 UTC m=+998.553043803 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/05de94e3-b61f-4df3-a8f4-a0b97d65b575-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-9ztsr" (UID: "05de94e3-b61f-4df3-a8f4-a0b97d65b575") : secret "openshift-nmstate-webhook" not found
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.837565 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmg2t\" (UniqueName: \"kubernetes.io/projected/463bc80d-5fb1-4bf0-b596-4f41571b3178-kube-api-access-lmg2t\") pod \"nmstate-metrics-7f946cbc9-hhc6d\" (UID: \"463bc80d-5fb1-4bf0-b596-4f41571b3178\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.840249 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"]
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.841535 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.846808 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.847067 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-plsn7"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.851491 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.866832 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"]
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.878963 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmg2t\" (UniqueName: \"kubernetes.io/projected/463bc80d-5fb1-4bf0-b596-4f41571b3178-kube-api-access-lmg2t\") pod \"nmstate-metrics-7f946cbc9-hhc6d\" (UID: \"463bc80d-5fb1-4bf0-b596-4f41571b3178\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.879252 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8552v\" (UniqueName: \"kubernetes.io/projected/05de94e3-b61f-4df3-a8f4-a0b97d65b575-kube-api-access-8552v\") pod \"nmstate-webhook-5f6d4c5ccb-9ztsr\" (UID: \"05de94e3-b61f-4df3-a8f4-a0b97d65b575\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.938963 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hcvb\" (UniqueName: \"kubernetes.io/projected/93f7aeec-1ff3-4cec-80b9-683bfda8584b-kube-api-access-9hcvb\") pod \"nmstate-console-plugin-7fbb5f6569-srcnc\" (UID: \"93f7aeec-1ff3-4cec-80b9-683bfda8584b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.939030 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/93f7aeec-1ff3-4cec-80b9-683bfda8584b-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-srcnc\" (UID: \"93f7aeec-1ff3-4cec-80b9-683bfda8584b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.939079 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1c4aa01f-df16-4f20-914f-1238c9c497ab-dbus-socket\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.939120 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1c4aa01f-df16-4f20-914f-1238c9c497ab-nmstate-lock\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.939141 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1c4aa01f-df16-4f20-914f-1238c9c497ab-ovs-socket\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.939161 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwqn\" (UniqueName: \"kubernetes.io/projected/1c4aa01f-df16-4f20-914f-1238c9c497ab-kube-api-access-dqwqn\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.939177 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/93f7aeec-1ff3-4cec-80b9-683bfda8584b-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-srcnc\" (UID: \"93f7aeec-1ff3-4cec-80b9-683bfda8584b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.939457 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1c4aa01f-df16-4f20-914f-1238c9c497ab-ovs-socket\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.939523 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1c4aa01f-df16-4f20-914f-1238c9c497ab-nmstate-lock\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.939546 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1c4aa01f-df16-4f20-914f-1238c9c497ab-dbus-socket\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.956075 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d"
Dec 05 00:39:38 crc kubenswrapper[4759]: I1205 00:39:38.960770 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwqn\" (UniqueName: \"kubernetes.io/projected/1c4aa01f-df16-4f20-914f-1238c9c497ab-kube-api-access-dqwqn\") pod \"nmstate-handler-nsbmm\" (UID: \"1c4aa01f-df16-4f20-914f-1238c9c497ab\") " pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.008801 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7fb5bf68f5-jznkg"]
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.009698 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.029100 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.030562 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fb5bf68f5-jznkg"]
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.043231 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/93f7aeec-1ff3-4cec-80b9-683bfda8584b-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-srcnc\" (UID: \"93f7aeec-1ff3-4cec-80b9-683bfda8584b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.043378 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/93f7aeec-1ff3-4cec-80b9-683bfda8584b-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-srcnc\" (UID: \"93f7aeec-1ff3-4cec-80b9-683bfda8584b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.043491 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hcvb\" (UniqueName: \"kubernetes.io/projected/93f7aeec-1ff3-4cec-80b9-683bfda8584b-kube-api-access-9hcvb\") pod \"nmstate-console-plugin-7fbb5f6569-srcnc\" (UID: \"93f7aeec-1ff3-4cec-80b9-683bfda8584b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.045293 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/93f7aeec-1ff3-4cec-80b9-683bfda8584b-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-srcnc\" (UID: \"93f7aeec-1ff3-4cec-80b9-683bfda8584b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:39 crc kubenswrapper[4759]: E1205 00:39:39.045421 4759 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Dec 05 00:39:39 crc kubenswrapper[4759]: E1205 00:39:39.045498 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f7aeec-1ff3-4cec-80b9-683bfda8584b-plugin-serving-cert podName:93f7aeec-1ff3-4cec-80b9-683bfda8584b nodeName:}" failed. No retries permitted until 2025-12-05 00:39:39.545460569 +0000 UTC m=+998.761121519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/93f7aeec-1ff3-4cec-80b9-683bfda8584b-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-srcnc" (UID: "93f7aeec-1ff3-4cec-80b9-683bfda8584b") : secret "plugin-serving-cert" not found
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.072377 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hcvb\" (UniqueName: \"kubernetes.io/projected/93f7aeec-1ff3-4cec-80b9-683bfda8584b-kube-api-access-9hcvb\") pod \"nmstate-console-plugin-7fbb5f6569-srcnc\" (UID: \"93f7aeec-1ff3-4cec-80b9-683bfda8584b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:39 crc kubenswrapper[4759]: W1205 00:39:39.093872 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c4aa01f_df16_4f20_914f_1238c9c497ab.slice/crio-f30310b80b6eef46dfa390a1c03de14941289d5bd14d52d750967ee925e93f10 WatchSource:0}: Error finding container f30310b80b6eef46dfa390a1c03de14941289d5bd14d52d750967ee925e93f10: Status 404 returned error can't find the container with id f30310b80b6eef46dfa390a1c03de14941289d5bd14d52d750967ee925e93f10
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.146521 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-service-ca\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.146586 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6mh6\" (UniqueName: \"kubernetes.io/projected/220b0801-31e1-4193-9b01-30a191741f12-kube-api-access-c6mh6\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.146639 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-oauth-serving-cert\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.146674 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-serving-cert\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.146695 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-trusted-ca-bundle\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.146728 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-oauth-config\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.146756 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-console-config\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.249434 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-oauth-config\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.249518 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-console-config\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.249582 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-service-ca\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.249626 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6mh6\" (UniqueName: \"kubernetes.io/projected/220b0801-31e1-4193-9b01-30a191741f12-kube-api-access-c6mh6\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.249690 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-oauth-serving-cert\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.249734 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-serving-cert\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.249764 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-trusted-ca-bundle\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.251051 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-service-ca\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.251275 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-trusted-ca-bundle\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.251946 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-console-config\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.252970 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-oauth-serving-cert\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.254424 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-oauth-config\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.257059 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-serving-cert\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.267287 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6mh6\" (UniqueName: \"kubernetes.io/projected/220b0801-31e1-4193-9b01-30a191741f12-kube-api-access-c6mh6\") pod \"console-7fb5bf68f5-jznkg\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.333328 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.351615 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/05de94e3-b61f-4df3-a8f4-a0b97d65b575-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-9ztsr\" (UID: \"05de94e3-b61f-4df3-a8f4-a0b97d65b575\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.355217 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/05de94e3-b61f-4df3-a8f4-a0b97d65b575-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-9ztsr\" (UID: \"05de94e3-b61f-4df3-a8f4-a0b97d65b575\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.464875 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d"]
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.556861 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/93f7aeec-1ff3-4cec-80b9-683bfda8584b-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-srcnc\" (UID: \"93f7aeec-1ff3-4cec-80b9-683bfda8584b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.557796 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fb5bf68f5-jznkg"]
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.560799 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/93f7aeec-1ff3-4cec-80b9-683bfda8584b-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-srcnc\" (UID: \"93f7aeec-1ff3-4cec-80b9-683bfda8584b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:39 crc kubenswrapper[4759]: W1205 00:39:39.565719 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod220b0801_31e1_4193_9b01_30a191741f12.slice/crio-0a7998bd64b4a8dccf01ce3b278a4361fdc42d1c917d8a3c88f696f72281cb5e WatchSource:0}: Error finding container 0a7998bd64b4a8dccf01ce3b278a4361fdc42d1c917d8a3c88f696f72281cb5e: Status 404 returned error can't find the container with id 0a7998bd64b4a8dccf01ce3b278a4361fdc42d1c917d8a3c88f696f72281cb5e
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.567346 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.751641 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fb5bf68f5-jznkg" event={"ID":"220b0801-31e1-4193-9b01-30a191741f12","Type":"ContainerStarted","Data":"0a7998bd64b4a8dccf01ce3b278a4361fdc42d1c917d8a3c88f696f72281cb5e"}
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.756970 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.762527 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-nsbmm" event={"ID":"1c4aa01f-df16-4f20-914f-1238c9c497ab","Type":"ContainerStarted","Data":"f30310b80b6eef46dfa390a1c03de14941289d5bd14d52d750967ee925e93f10"}
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.763387 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d" event={"ID":"463bc80d-5fb1-4bf0-b596-4f41571b3178","Type":"ContainerStarted","Data":"c1d55ab9f22908e5670491eb627d36b4ff0e95f2a7eaef93609a19b5ebf81ac3"}
Dec 05 00:39:39 crc kubenswrapper[4759]: I1205 00:39:39.945861 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"]
Dec 05 00:39:40 crc kubenswrapper[4759]: I1205 00:39:40.231819 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc"]
Dec 05 00:39:40 crc kubenswrapper[4759]: W1205 00:39:40.234579 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f7aeec_1ff3_4cec_80b9_683bfda8584b.slice/crio-10e760452ac65991e73ecc69cc7fbed1fdb34fb3d2c97777c3377f96c66c7f42 WatchSource:0}: Error finding container 10e760452ac65991e73ecc69cc7fbed1fdb34fb3d2c97777c3377f96c66c7f42: Status 404 returned error can't find the container with id 10e760452ac65991e73ecc69cc7fbed1fdb34fb3d2c97777c3377f96c66c7f42
Dec 05 00:39:40 crc kubenswrapper[4759]: I1205 00:39:40.773063 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc" event={"ID":"93f7aeec-1ff3-4cec-80b9-683bfda8584b","Type":"ContainerStarted","Data":"10e760452ac65991e73ecc69cc7fbed1fdb34fb3d2c97777c3377f96c66c7f42"}
Dec 05 00:39:40 crc kubenswrapper[4759]: I1205 00:39:40.775706 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fb5bf68f5-jznkg" event={"ID":"220b0801-31e1-4193-9b01-30a191741f12","Type":"ContainerStarted","Data":"3a8656dfb732f941f029996bc734a318c98e85bacf407593445f85cb8b644bf4"}
Dec 05 00:39:40 crc kubenswrapper[4759]: I1205 00:39:40.780594 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr" event={"ID":"05de94e3-b61f-4df3-a8f4-a0b97d65b575","Type":"ContainerStarted","Data":"002db4a609f4e36ca2560b083b0aa9422285e7df19982bdbc7b384254b579f29"}
Dec 05 00:39:41 crc kubenswrapper[4759]: I1205 00:39:41.176354 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7fb5bf68f5-jznkg" podStartSLOduration=3.176321035 podStartE2EDuration="3.176321035s" podCreationTimestamp="2025-12-05 00:39:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:39:40.802448041 +0000 UTC m=+1000.018108991" watchObservedRunningTime="2025-12-05 00:39:41.176321035 +0000 UTC m=+1000.391981985"
Dec 05 00:39:42 crc kubenswrapper[4759]: I1205 00:39:42.811023 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr" event={"ID":"05de94e3-b61f-4df3-a8f4-a0b97d65b575","Type":"ContainerStarted","Data":"54a0a4766834ad0289e0e10333453e66fd8dd1b4f9573e740a106402fb95ec15"}
Dec 05 00:39:42 crc kubenswrapper[4759]: I1205 00:39:42.812444 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"
Dec 05 00:39:42 crc kubenswrapper[4759]: I1205 00:39:42.813823 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-nsbmm" event={"ID":"1c4aa01f-df16-4f20-914f-1238c9c497ab","Type":"ContainerStarted","Data":"e35ab0b0420de3a35026fae7b1f2e655b56878fe577e886c3974c2e97536cfcc"}
Dec 05 00:39:42 crc kubenswrapper[4759]: I1205 00:39:42.814985 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:42 crc kubenswrapper[4759]: I1205 00:39:42.817840 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d" event={"ID":"463bc80d-5fb1-4bf0-b596-4f41571b3178","Type":"ContainerStarted","Data":"c6b3b9def908000e0246f535bd6dbdf92c2e33da375701ab1810878ee881ccfa"}
Dec 05 00:39:42 crc kubenswrapper[4759]: I1205 00:39:42.835466 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr" podStartSLOduration=2.516463508 podStartE2EDuration="4.835440755s" podCreationTimestamp="2025-12-05 00:39:38 +0000 UTC" firstStartedPulling="2025-12-05 00:39:39.958671187 +0000 UTC m=+999.174332137" lastFinishedPulling="2025-12-05 00:39:42.277648414 +0000 UTC m=+1001.493309384" observedRunningTime="2025-12-05 00:39:42.830603835 +0000 UTC m=+1002.046264795" watchObservedRunningTime="2025-12-05 00:39:42.835440755 +0000 UTC m=+1002.051101705"
Dec 05 00:39:42 crc kubenswrapper[4759]: I1205 00:39:42.850992 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-nsbmm" podStartSLOduration=1.712665587 podStartE2EDuration="4.850974891s" podCreationTimestamp="2025-12-05 00:39:38 +0000 UTC" firstStartedPulling="2025-12-05 00:39:39.096172938 +0000 UTC m=+998.311833888" lastFinishedPulling="2025-12-05 00:39:42.234482242 +0000 UTC m=+1001.450143192" observedRunningTime="2025-12-05 00:39:42.847820963 +0000 UTC m=+1002.063481913" watchObservedRunningTime="2025-12-05 00:39:42.850974891 +0000 UTC m=+1002.066635841"
Dec 05 00:39:43 crc kubenswrapper[4759]: I1205 00:39:43.825887 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc" event={"ID":"93f7aeec-1ff3-4cec-80b9-683bfda8584b","Type":"ContainerStarted","Data":"90008fa2683154dd1e051442cddd2080ae0cf8a778ea2c5dc8e409a887054b25"}
Dec 05 00:39:43 crc kubenswrapper[4759]: I1205 00:39:43.842320 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-srcnc" podStartSLOduration=2.5153291700000002 podStartE2EDuration="5.842299728s" podCreationTimestamp="2025-12-05 00:39:38 +0000 UTC" firstStartedPulling="2025-12-05 00:39:40.236454465 +0000 UTC m=+999.452115415" lastFinishedPulling="2025-12-05 00:39:43.563425023 +0000 UTC m=+1002.779085973" observedRunningTime="2025-12-05 00:39:43.84156518 +0000 UTC m=+1003.057226150" watchObservedRunningTime="2025-12-05 00:39:43.842299728 +0000 UTC m=+1003.057960678"
Dec 05 00:39:45 crc kubenswrapper[4759]: I1205 00:39:45.840196 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d" event={"ID":"463bc80d-5fb1-4bf0-b596-4f41571b3178","Type":"ContainerStarted","Data":"5fdbd0ef3753d0c5f661aef2b2f4ff501559b24eac41a9f7095b6bea33bf0587"}
Dec 05 00:39:45 crc kubenswrapper[4759]: I1205 00:39:45.854547 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-hhc6d" podStartSLOduration=2.617891206 podStartE2EDuration="7.854529976s" podCreationTimestamp="2025-12-05 00:39:38 +0000 UTC" firstStartedPulling="2025-12-05 00:39:39.4763768 +0000 UTC m=+998.692037750" lastFinishedPulling="2025-12-05 00:39:44.71301558 +0000 UTC m=+1003.928676520" observedRunningTime="2025-12-05 00:39:45.853338517 +0000 UTC m=+1005.068999477" watchObservedRunningTime="2025-12-05 00:39:45.854529976 +0000 UTC m=+1005.070190926"
Dec 05 00:39:49 crc kubenswrapper[4759]: I1205 00:39:49.065777 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-nsbmm"
Dec 05 00:39:49 crc kubenswrapper[4759]: I1205 00:39:49.334165 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:49 crc kubenswrapper[4759]: I1205 00:39:49.334542 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:49 crc kubenswrapper[4759]: I1205 00:39:49.340853 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:49 crc kubenswrapper[4759]: I1205 00:39:49.870473 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7fb5bf68f5-jznkg"
Dec 05 00:39:49 crc kubenswrapper[4759]: I1205 00:39:49.981891 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-g8mxq"]
Dec 05 00:39:59 crc kubenswrapper[4759]: I1205 00:39:59.575807 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9ztsr"
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.034633 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-g8mxq" podUID="576c976f-56ce-4409-8654-e9a6264a71d1" containerName="console" containerID="cri-o://6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd" gracePeriod=15
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.487676 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-g8mxq_576c976f-56ce-4409-8654-e9a6264a71d1/console/0.log"
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.488104 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g8mxq"
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.518767 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-console-config\") pod \"576c976f-56ce-4409-8654-e9a6264a71d1\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") "
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.518903 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-trusted-ca-bundle\") pod \"576c976f-56ce-4409-8654-e9a6264a71d1\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") "
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.518929 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-oauth-config\") pod \"576c976f-56ce-4409-8654-e9a6264a71d1\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") "
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.518957 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-serving-cert\") pod \"576c976f-56ce-4409-8654-e9a6264a71d1\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") "
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.518979 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b55v9\" (UniqueName: \"kubernetes.io/projected/576c976f-56ce-4409-8654-e9a6264a71d1-kube-api-access-b55v9\") pod \"576c976f-56ce-4409-8654-e9a6264a71d1\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") "
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.518994 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-service-ca\") pod \"576c976f-56ce-4409-8654-e9a6264a71d1\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") "
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.519066 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-oauth-serving-cert\") pod \"576c976f-56ce-4409-8654-e9a6264a71d1\" (UID: \"576c976f-56ce-4409-8654-e9a6264a71d1\") "
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.520325 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-service-ca" (OuterVolumeSpecName: "service-ca") pod "576c976f-56ce-4409-8654-e9a6264a71d1" (UID: "576c976f-56ce-4409-8654-e9a6264a71d1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.520652 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "576c976f-56ce-4409-8654-e9a6264a71d1" (UID: "576c976f-56ce-4409-8654-e9a6264a71d1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.521259 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-console-config" (OuterVolumeSpecName: "console-config") pod "576c976f-56ce-4409-8654-e9a6264a71d1" (UID: "576c976f-56ce-4409-8654-e9a6264a71d1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.524357 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "576c976f-56ce-4409-8654-e9a6264a71d1" (UID: "576c976f-56ce-4409-8654-e9a6264a71d1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.524796 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "576c976f-56ce-4409-8654-e9a6264a71d1" (UID: "576c976f-56ce-4409-8654-e9a6264a71d1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.525382 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/576c976f-56ce-4409-8654-e9a6264a71d1-kube-api-access-b55v9" (OuterVolumeSpecName: "kube-api-access-b55v9") pod "576c976f-56ce-4409-8654-e9a6264a71d1" (UID: "576c976f-56ce-4409-8654-e9a6264a71d1"). InnerVolumeSpecName "kube-api-access-b55v9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.526152 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "576c976f-56ce-4409-8654-e9a6264a71d1" (UID: "576c976f-56ce-4409-8654-e9a6264a71d1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.627666 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.627713 4759 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-oauth-config\") on node \"crc\" DevicePath \"\""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.627733 4759 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/576c976f-56ce-4409-8654-e9a6264a71d1-console-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.627746 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b55v9\" (UniqueName: \"kubernetes.io/projected/576c976f-56ce-4409-8654-e9a6264a71d1-kube-api-access-b55v9\") on node \"crc\" DevicePath \"\""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.627761 4759 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-service-ca\") on node \"crc\" DevicePath \"\""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.627778 4759 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 00:40:15 crc kubenswrapper[4759]: I1205 00:40:15.627791 4759 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/576c976f-56ce-4409-8654-e9a6264a71d1-console-config\") on node \"crc\" DevicePath \"\""
Dec 05 00:40:16 crc kubenswrapper[4759]: I1205 00:40:16.063056 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-g8mxq_576c976f-56ce-4409-8654-e9a6264a71d1/console/0.log"
Dec 05 00:40:16 crc kubenswrapper[4759]: I1205 00:40:16.063104 4759 generic.go:334] "Generic (PLEG): container finished" podID="576c976f-56ce-4409-8654-e9a6264a71d1" containerID="6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd" exitCode=2
Dec 05 00:40:16 crc kubenswrapper[4759]: I1205 00:40:16.063140 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g8mxq" event={"ID":"576c976f-56ce-4409-8654-e9a6264a71d1","Type":"ContainerDied","Data":"6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd"}
Dec 05 00:40:16 crc kubenswrapper[4759]: I1205 00:40:16.063170 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g8mxq" event={"ID":"576c976f-56ce-4409-8654-e9a6264a71d1","Type":"ContainerDied","Data":"a140b1bdb62a6b99086c8f3fbdcf7a4e1d9979505a0efb9e232d918a05505ea8"}
Dec 05 00:40:16 crc kubenswrapper[4759]: I1205 00:40:16.063189 4759 scope.go:117] "RemoveContainer" containerID="6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd"
Dec 05 00:40:16 crc kubenswrapper[4759]: I1205 00:40:16.063197 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g8mxq"
Dec 05 00:40:16 crc kubenswrapper[4759]: I1205 00:40:16.101383 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-g8mxq"]
Dec 05 00:40:16 crc kubenswrapper[4759]: I1205 00:40:16.105211 4759 scope.go:117] "RemoveContainer" containerID="6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd"
Dec 05 00:40:16 crc kubenswrapper[4759]: E1205 00:40:16.105932 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd\": container with ID starting with 6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd not found: ID does not exist" containerID="6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd"
Dec 05 00:40:16 crc kubenswrapper[4759]: I1205 00:40:16.105995 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd"} err="failed to get container status \"6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd\": rpc error: code = NotFound desc = could not find container \"6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd\": container with ID starting with 6736307c03a9c722b175228ee64a9dca910eda8fc6edaf5c2ab3259688e5e1cd not found: ID does not exist"
Dec 05 00:40:16 crc kubenswrapper[4759]: I1205 00:40:16.107757 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-g8mxq"]
Dec 05 00:40:17 crc kubenswrapper[4759]: I1205 00:40:17.163206 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="576c976f-56ce-4409-8654-e9a6264a71d1" path="/var/lib/kubelet/pods/576c976f-56ce-4409-8654-e9a6264a71d1/volumes"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.236174 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"]
Dec 05 00:40:19 crc kubenswrapper[4759]: E1205 00:40:19.236736 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576c976f-56ce-4409-8654-e9a6264a71d1" containerName="console"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.236750 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="576c976f-56ce-4409-8654-e9a6264a71d1" containerName="console"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.236865 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="576c976f-56ce-4409-8654-e9a6264a71d1" containerName="console"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.237758 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.240945 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.256588 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"]
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.375167 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvz8w\" (UniqueName: \"kubernetes.io/projected/ac9a191d-1c37-4695-82d2-d502ed5245ff-kube-api-access-fvz8w\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.375254 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.375330 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.476479 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvz8w\" (UniqueName: \"kubernetes.io/projected/ac9a191d-1c37-4695-82d2-d502ed5245ff-kube-api-access-fvz8w\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.476569 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.476608 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.477158 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.477211 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.499778 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvz8w\" (UniqueName: \"kubernetes.io/projected/ac9a191d-1c37-4695-82d2-d502ed5245ff-kube-api-access-fvz8w\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:19 crc kubenswrapper[4759]: I1205 00:40:19.558101 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:20 crc kubenswrapper[4759]: I1205 00:40:20.052232 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"]
Dec 05 00:40:20 crc kubenswrapper[4759]: I1205 00:40:20.111545 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd" event={"ID":"ac9a191d-1c37-4695-82d2-d502ed5245ff","Type":"ContainerStarted","Data":"0203f3925ccf99e7e7178ba80f26f11c7071f5ae4e741af2da8c6f2c42861584"}
Dec 05 00:40:21 crc kubenswrapper[4759]: I1205 00:40:21.122725 4759 generic.go:334] "Generic (PLEG): container finished" podID="ac9a191d-1c37-4695-82d2-d502ed5245ff" containerID="d03b4cdb1da7ede8535e8f0dc6a4e10db580166dc2e61e2d7680299c482e8d02" exitCode=0
Dec 05 00:40:21 crc kubenswrapper[4759]: I1205 00:40:21.122963 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd" event={"ID":"ac9a191d-1c37-4695-82d2-d502ed5245ff","Type":"ContainerDied","Data":"d03b4cdb1da7ede8535e8f0dc6a4e10db580166dc2e61e2d7680299c482e8d02"}
Dec 05 00:40:21 crc kubenswrapper[4759]: I1205 00:40:21.126047 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 00:40:23 crc kubenswrapper[4759]: I1205 00:40:23.138803 4759 generic.go:334] "Generic (PLEG): container finished" podID="ac9a191d-1c37-4695-82d2-d502ed5245ff" containerID="63d52faef625e7b970116140ac510b5caea4a87a08d37a5ef0d0142e9767ac8f" exitCode=0
Dec 05 00:40:23 crc kubenswrapper[4759]: I1205 00:40:23.138848 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd" event={"ID":"ac9a191d-1c37-4695-82d2-d502ed5245ff","Type":"ContainerDied","Data":"63d52faef625e7b970116140ac510b5caea4a87a08d37a5ef0d0142e9767ac8f"}
Dec 05 00:40:24 crc kubenswrapper[4759]: I1205 00:40:24.150460 4759 generic.go:334] "Generic (PLEG): container finished" podID="ac9a191d-1c37-4695-82d2-d502ed5245ff" containerID="ce99e28ceafa4c86e7d0c265b9646000c96c0109285e5241d7e5138de08c8848" exitCode=0
Dec 05 00:40:24 crc kubenswrapper[4759]: I1205 00:40:24.150621 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd" event={"ID":"ac9a191d-1c37-4695-82d2-d502ed5245ff","Type":"ContainerDied","Data":"ce99e28ceafa4c86e7d0c265b9646000c96c0109285e5241d7e5138de08c8848"}
Dec 05 00:40:25 crc kubenswrapper[4759]: I1205 00:40:25.516474 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:25 crc kubenswrapper[4759]: I1205 00:40:25.682907 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-util\") pod \"ac9a191d-1c37-4695-82d2-d502ed5245ff\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") "
Dec 05 00:40:25 crc kubenswrapper[4759]: I1205 00:40:25.683023 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvz8w\" (UniqueName: \"kubernetes.io/projected/ac9a191d-1c37-4695-82d2-d502ed5245ff-kube-api-access-fvz8w\") pod \"ac9a191d-1c37-4695-82d2-d502ed5245ff\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") "
Dec 05 00:40:25 crc kubenswrapper[4759]: I1205 00:40:25.683121 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-bundle\") pod \"ac9a191d-1c37-4695-82d2-d502ed5245ff\" (UID: \"ac9a191d-1c37-4695-82d2-d502ed5245ff\") "
Dec 05 00:40:25 crc kubenswrapper[4759]: I1205 00:40:25.685066 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-bundle" (OuterVolumeSpecName: "bundle") pod "ac9a191d-1c37-4695-82d2-d502ed5245ff" (UID: "ac9a191d-1c37-4695-82d2-d502ed5245ff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:40:25 crc kubenswrapper[4759]: I1205 00:40:25.697641 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac9a191d-1c37-4695-82d2-d502ed5245ff-kube-api-access-fvz8w" (OuterVolumeSpecName: "kube-api-access-fvz8w") pod "ac9a191d-1c37-4695-82d2-d502ed5245ff" (UID: "ac9a191d-1c37-4695-82d2-d502ed5245ff"). InnerVolumeSpecName "kube-api-access-fvz8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:40:25 crc kubenswrapper[4759]: I1205 00:40:25.698815 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-util" (OuterVolumeSpecName: "util") pod "ac9a191d-1c37-4695-82d2-d502ed5245ff" (UID: "ac9a191d-1c37-4695-82d2-d502ed5245ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:40:25 crc kubenswrapper[4759]: I1205 00:40:25.784999 4759 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-util\") on node \"crc\" DevicePath \"\""
Dec 05 00:40:25 crc kubenswrapper[4759]: I1205 00:40:25.785039 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvz8w\" (UniqueName: \"kubernetes.io/projected/ac9a191d-1c37-4695-82d2-d502ed5245ff-kube-api-access-fvz8w\") on node \"crc\" DevicePath \"\""
Dec 05 00:40:25 crc kubenswrapper[4759]: I1205 00:40:25.785049 4759 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac9a191d-1c37-4695-82d2-d502ed5245ff-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:40:26 crc kubenswrapper[4759]: I1205 00:40:26.168629 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd" event={"ID":"ac9a191d-1c37-4695-82d2-d502ed5245ff","Type":"ContainerDied","Data":"0203f3925ccf99e7e7178ba80f26f11c7071f5ae4e741af2da8c6f2c42861584"}
Dec 05 00:40:26 crc kubenswrapper[4759]: I1205 00:40:26.168887 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0203f3925ccf99e7e7178ba80f26f11c7071f5ae4e741af2da8c6f2c42861584"
Dec 05 00:40:26 crc kubenswrapper[4759]: I1205 00:40:26.168697 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.900209 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"]
Dec 05 00:40:35 crc kubenswrapper[4759]: E1205 00:40:35.900986 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac9a191d-1c37-4695-82d2-d502ed5245ff" containerName="util"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.900999 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac9a191d-1c37-4695-82d2-d502ed5245ff" containerName="util"
Dec 05 00:40:35 crc kubenswrapper[4759]: E1205 00:40:35.901013 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac9a191d-1c37-4695-82d2-d502ed5245ff" containerName="extract"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.901018 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac9a191d-1c37-4695-82d2-d502ed5245ff" containerName="extract"
Dec 05 00:40:35 crc kubenswrapper[4759]: E1205 00:40:35.901027 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac9a191d-1c37-4695-82d2-d502ed5245ff" containerName="pull"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.901033 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac9a191d-1c37-4695-82d2-d502ed5245ff" containerName="pull"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.901148 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac9a191d-1c37-4695-82d2-d502ed5245ff" containerName="extract"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.901621 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.905082 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-s2rpc"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.905188 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.906165 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.909803 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.913822 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Dec 05 00:40:35 crc kubenswrapper[4759]: I1205 00:40:35.981328 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"]
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.023521 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5b08a58-e4f1-4520-aec9-e0f99e93e731-webhook-cert\") pod \"metallb-operator-controller-manager-5544dd96f7-h9gmp\" (UID: \"f5b08a58-e4f1-4520-aec9-e0f99e93e731\") " pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.023584 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2zgj\" (UniqueName: \"kubernetes.io/projected/f5b08a58-e4f1-4520-aec9-e0f99e93e731-kube-api-access-w2zgj\") pod \"metallb-operator-controller-manager-5544dd96f7-h9gmp\" (UID: \"f5b08a58-e4f1-4520-aec9-e0f99e93e731\") " pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.023752 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5b08a58-e4f1-4520-aec9-e0f99e93e731-apiservice-cert\") pod \"metallb-operator-controller-manager-5544dd96f7-h9gmp\" (UID: \"f5b08a58-e4f1-4520-aec9-e0f99e93e731\") " pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.125271 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5b08a58-e4f1-4520-aec9-e0f99e93e731-webhook-cert\") pod \"metallb-operator-controller-manager-5544dd96f7-h9gmp\" (UID: \"f5b08a58-e4f1-4520-aec9-e0f99e93e731\") " pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.125351 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zgj\" (UniqueName: \"kubernetes.io/projected/f5b08a58-e4f1-4520-aec9-e0f99e93e731-kube-api-access-w2zgj\") pod \"metallb-operator-controller-manager-5544dd96f7-h9gmp\" (UID: \"f5b08a58-e4f1-4520-aec9-e0f99e93e731\") " pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.125419 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5b08a58-e4f1-4520-aec9-e0f99e93e731-apiservice-cert\") pod \"metallb-operator-controller-manager-5544dd96f7-h9gmp\" (UID: \"f5b08a58-e4f1-4520-aec9-e0f99e93e731\") " pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.139270 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5b08a58-e4f1-4520-aec9-e0f99e93e731-apiservice-cert\") pod \"metallb-operator-controller-manager-5544dd96f7-h9gmp\" (UID: \"f5b08a58-e4f1-4520-aec9-e0f99e93e731\") " pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.139264 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5b08a58-e4f1-4520-aec9-e0f99e93e731-webhook-cert\") pod \"metallb-operator-controller-manager-5544dd96f7-h9gmp\" (UID: \"f5b08a58-e4f1-4520-aec9-e0f99e93e731\") " pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.149060 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2zgj\" (UniqueName: \"kubernetes.io/projected/f5b08a58-e4f1-4520-aec9-e0f99e93e731-kube-api-access-w2zgj\") pod \"metallb-operator-controller-manager-5544dd96f7-h9gmp\" (UID: \"f5b08a58-e4f1-4520-aec9-e0f99e93e731\") " pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.156247 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h"]
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.157081 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.159349 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-mk4lj"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.159688 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.159863 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.183443 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h"]
Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.217177 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp" Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.330261 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4czg\" (UniqueName: \"kubernetes.io/projected/161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc-kube-api-access-x4czg\") pod \"metallb-operator-webhook-server-6959c5664d-69r4h\" (UID: \"161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc\") " pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.330515 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc-webhook-cert\") pod \"metallb-operator-webhook-server-6959c5664d-69r4h\" (UID: \"161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc\") " pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.330545 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc-apiservice-cert\") pod \"metallb-operator-webhook-server-6959c5664d-69r4h\" (UID: \"161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc\") " pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.431392 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4czg\" (UniqueName: \"kubernetes.io/projected/161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc-kube-api-access-x4czg\") pod \"metallb-operator-webhook-server-6959c5664d-69r4h\" (UID: \"161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc\") " pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.431478 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc-webhook-cert\") pod \"metallb-operator-webhook-server-6959c5664d-69r4h\" (UID: \"161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc\") " pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.431499 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc-apiservice-cert\") pod \"metallb-operator-webhook-server-6959c5664d-69r4h\" (UID: \"161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc\") " pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.435588 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc-webhook-cert\") pod \"metallb-operator-webhook-server-6959c5664d-69r4h\" (UID: \"161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc\") " pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.451030 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc-apiservice-cert\") pod \"metallb-operator-webhook-server-6959c5664d-69r4h\" (UID: \"161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc\") " 
pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.451247 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4czg\" (UniqueName: \"kubernetes.io/projected/161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc-kube-api-access-x4czg\") pod \"metallb-operator-webhook-server-6959c5664d-69r4h\" (UID: \"161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc\") " pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.506145 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.639608 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp"] Dec 05 00:40:36 crc kubenswrapper[4759]: W1205 00:40:36.666358 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b08a58_e4f1_4520_aec9_e0f99e93e731.slice/crio-7cc1ba076a1aafd7442f01b0c4fbb4be109d4484bdb7ccd4ecd2d34a52919d90 WatchSource:0}: Error finding container 7cc1ba076a1aafd7442f01b0c4fbb4be109d4484bdb7ccd4ecd2d34a52919d90: Status 404 returned error can't find the container with id 7cc1ba076a1aafd7442f01b0c4fbb4be109d4484bdb7ccd4ecd2d34a52919d90 Dec 05 00:40:36 crc kubenswrapper[4759]: I1205 00:40:36.993946 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h"] Dec 05 00:40:36 crc kubenswrapper[4759]: W1205 00:40:36.995640 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod161c1f00_7ae1_4d8e_8d03_48b55dd5a8cc.slice/crio-c7d9986eac783b91ef22ab8d7a3d078a7ebb26d450ca2b20589c579bebb00604 WatchSource:0}: Error finding container c7d9986eac783b91ef22ab8d7a3d078a7ebb26d450ca2b20589c579bebb00604: Status 404 returned error can't find the container with id c7d9986eac783b91ef22ab8d7a3d078a7ebb26d450ca2b20589c579bebb00604 Dec 05 00:40:37 crc kubenswrapper[4759]: I1205 00:40:37.268915 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" event={"ID":"161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc","Type":"ContainerStarted","Data":"c7d9986eac783b91ef22ab8d7a3d078a7ebb26d450ca2b20589c579bebb00604"} Dec 05 00:40:37 crc kubenswrapper[4759]: I1205 00:40:37.270530 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp" event={"ID":"f5b08a58-e4f1-4520-aec9-e0f99e93e731","Type":"ContainerStarted","Data":"7cc1ba076a1aafd7442f01b0c4fbb4be109d4484bdb7ccd4ecd2d34a52919d90"} Dec 05 00:40:40 crc kubenswrapper[4759]: I1205 00:40:40.293590 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp" event={"ID":"f5b08a58-e4f1-4520-aec9-e0f99e93e731","Type":"ContainerStarted","Data":"db37111a2006fc5c54733de6090f05e385bbcc7a67cd7bddb3d455f7ec2a6312"} Dec 05 00:40:40 crc kubenswrapper[4759]: I1205 00:40:40.294298 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp" Dec 05 00:40:40 crc kubenswrapper[4759]: I1205 00:40:40.324486 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp" podStartSLOduration=2.014825939 podStartE2EDuration="5.324467767s" podCreationTimestamp="2025-12-05 00:40:35 +0000 UTC" firstStartedPulling="2025-12-05 00:40:36.668792982 +0000 UTC m=+1055.884453922" lastFinishedPulling="2025-12-05 00:40:39.9784348 +0000 UTC m=+1059.194095750" observedRunningTime="2025-12-05 00:40:40.312741387 +0000 UTC m=+1059.528402357" watchObservedRunningTime="2025-12-05 00:40:40.324467767 +0000 UTC m=+1059.540128717" Dec 05 00:40:43 crc kubenswrapper[4759]: I1205 00:40:43.316619 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" event={"ID":"161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc","Type":"ContainerStarted","Data":"d1117df3333fe4214de38aab0a6ef4bd151b64b301a6c8bd48c5ab76ed458927"} Dec 05 00:40:43 crc kubenswrapper[4759]: I1205 00:40:43.341402 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" podStartSLOduration=1.99467276 podStartE2EDuration="7.341383953s" podCreationTimestamp="2025-12-05 00:40:36 +0000 UTC" firstStartedPulling="2025-12-05 00:40:36.998239059 +0000 UTC m=+1056.213900019" lastFinishedPulling="2025-12-05 00:40:42.344950262 +0000 UTC m=+1061.560611212" observedRunningTime="2025-12-05 00:40:43.338532463 +0000 UTC m=+1062.554193433" watchObservedRunningTime="2025-12-05 00:40:43.341383953 +0000 UTC m=+1062.557044913" Dec 05 00:40:44 crc kubenswrapper[4759]: I1205 00:40:44.323816 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:40:56 crc kubenswrapper[4759]: I1205 00:40:56.515479 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6959c5664d-69r4h" Dec 05 00:41:04 crc kubenswrapper[4759]: I1205 00:41:04.433549 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:41:04 crc kubenswrapper[4759]: I1205 00:41:04.434279 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:41:16 crc kubenswrapper[4759]: I1205 00:41:16.220860 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5544dd96f7-h9gmp" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.124351 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-2hbsc"] Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.127053 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.130849 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.131022 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.131410 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vh6tr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.137941 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw"] Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.139430 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.144857 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.148588 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw"] Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.222387 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wv8kr"] Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.223448 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.226127 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.226670 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-twbkp" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.226818 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.227368 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.243415 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99xxv\" (UniqueName: \"kubernetes.io/projected/b81e7b66-fed2-4b1e-8504-22a839862f14-kube-api-access-99xxv\") pod \"frr-k8s-webhook-server-7fcb986d4-7kpbw\" (UID: \"b81e7b66-fed2-4b1e-8504-22a839862f14\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.243462 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-frr-sockets\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.243499 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1343e9fd-38e9-4285-89a6-f4a15dfca396-metrics-certs\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: 
I1205 00:41:17.243525 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-metrics\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.243595 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbsgs\" (UniqueName: \"kubernetes.io/projected/1343e9fd-38e9-4285-89a6-f4a15dfca396-kube-api-access-fbsgs\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.243657 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b81e7b66-fed2-4b1e-8504-22a839862f14-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7kpbw\" (UID: \"b81e7b66-fed2-4b1e-8504-22a839862f14\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.243728 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-reloader\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.243819 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1343e9fd-38e9-4285-89a6-f4a15dfca396-frr-startup\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.243964 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-frr-conf\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.253786 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-dk9mn"] Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.255159 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.257366 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.258965 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-dk9mn"] Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345592 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-metrics\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345643 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-metallb-excludel2\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345668 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbsgs\" (UniqueName: \"kubernetes.io/projected/1343e9fd-38e9-4285-89a6-f4a15dfca396-kube-api-access-fbsgs\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345688 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-metrics-certs\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345723 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b81e7b66-fed2-4b1e-8504-22a839862f14-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7kpbw\" (UID: \"b81e7b66-fed2-4b1e-8504-22a839862f14\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345755 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-reloader\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345785 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78f52cea-319f-4493-aa77-b97f1fed1583-cert\") pod \"controller-f8648f98b-dk9mn\" (UID: \"78f52cea-319f-4493-aa77-b97f1fed1583\") " pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345804 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-memberlist\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345821 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-fsd6d\" (UniqueName: \"kubernetes.io/projected/78f52cea-319f-4493-aa77-b97f1fed1583-kube-api-access-fsd6d\") pod \"controller-f8648f98b-dk9mn\" (UID: \"78f52cea-319f-4493-aa77-b97f1fed1583\") " pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345862 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1343e9fd-38e9-4285-89a6-f4a15dfca396-frr-startup\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345882 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddb7g\" (UniqueName: \"kubernetes.io/projected/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-kube-api-access-ddb7g\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345903 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-frr-conf\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345922 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99xxv\" (UniqueName: \"kubernetes.io/projected/b81e7b66-fed2-4b1e-8504-22a839862f14-kube-api-access-99xxv\") pod \"frr-k8s-webhook-server-7fcb986d4-7kpbw\" (UID: \"b81e7b66-fed2-4b1e-8504-22a839862f14\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345940 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f52cea-319f-4493-aa77-b97f1fed1583-metrics-certs\") pod \"controller-f8648f98b-dk9mn\" (UID: \"78f52cea-319f-4493-aa77-b97f1fed1583\") " pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345967 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-frr-sockets\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.345987 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1343e9fd-38e9-4285-89a6-f4a15dfca396-metrics-certs\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.346115 4759 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.346159 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1343e9fd-38e9-4285-89a6-f4a15dfca396-metrics-certs podName:1343e9fd-38e9-4285-89a6-f4a15dfca396 nodeName:}" failed. No retries permitted until 2025-12-05 00:41:17.846144327 +0000 UTC m=+1097.061805277 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1343e9fd-38e9-4285-89a6-f4a15dfca396-metrics-certs") pod "frr-k8s-2hbsc" (UID: "1343e9fd-38e9-4285-89a6-f4a15dfca396") : secret "frr-k8s-certs-secret" not found Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.346701 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-metrics\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.346896 4759 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.346925 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b81e7b66-fed2-4b1e-8504-22a839862f14-cert podName:b81e7b66-fed2-4b1e-8504-22a839862f14 nodeName:}" failed. No retries permitted until 2025-12-05 00:41:17.846918067 +0000 UTC m=+1097.062579017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b81e7b66-fed2-4b1e-8504-22a839862f14-cert") pod "frr-k8s-webhook-server-7fcb986d4-7kpbw" (UID: "b81e7b66-fed2-4b1e-8504-22a839862f14") : secret "frr-k8s-webhook-server-cert" not found Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.347105 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-reloader\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.347804 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1343e9fd-38e9-4285-89a6-f4a15dfca396-frr-startup\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.348010 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-frr-conf\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.348796 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1343e9fd-38e9-4285-89a6-f4a15dfca396-frr-sockets\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.380050 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbsgs\" (UniqueName: \"kubernetes.io/projected/1343e9fd-38e9-4285-89a6-f4a15dfca396-kube-api-access-fbsgs\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.381103 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99xxv\" (UniqueName: \"kubernetes.io/projected/b81e7b66-fed2-4b1e-8504-22a839862f14-kube-api-access-99xxv\") pod \"frr-k8s-webhook-server-7fcb986d4-7kpbw\" (UID: \"b81e7b66-fed2-4b1e-8504-22a839862f14\") " 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.447763 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddb7g\" (UniqueName: \"kubernetes.io/projected/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-kube-api-access-ddb7g\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.447833 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f52cea-319f-4493-aa77-b97f1fed1583-metrics-certs\") pod \"controller-f8648f98b-dk9mn\" (UID: \"78f52cea-319f-4493-aa77-b97f1fed1583\") " pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.447867 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-metallb-excludel2\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.447893 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-metrics-certs\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.447953 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78f52cea-319f-4493-aa77-b97f1fed1583-cert\") pod \"controller-f8648f98b-dk9mn\" (UID: \"78f52cea-319f-4493-aa77-b97f1fed1583\") " pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.447973 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-memberlist\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.447993 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsd6d\" (UniqueName: \"kubernetes.io/projected/78f52cea-319f-4493-aa77-b97f1fed1583-kube-api-access-fsd6d\") pod \"controller-f8648f98b-dk9mn\" (UID: \"78f52cea-319f-4493-aa77-b97f1fed1583\") " pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.448500 4759 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.448564 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78f52cea-319f-4493-aa77-b97f1fed1583-metrics-certs podName:78f52cea-319f-4493-aa77-b97f1fed1583 nodeName:}" failed. No retries permitted until 2025-12-05 00:41:17.948548423 +0000 UTC m=+1097.164209373 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78f52cea-319f-4493-aa77-b97f1fed1583-metrics-certs") pod "controller-f8648f98b-dk9mn" (UID: "78f52cea-319f-4493-aa77-b97f1fed1583") : secret "controller-certs-secret" not found Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.449386 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-metallb-excludel2\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.449470 4759 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.449504 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-metrics-certs podName:b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51 nodeName:}" failed. No retries permitted until 2025-12-05 00:41:17.949495357 +0000 UTC m=+1097.165156307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-metrics-certs") pod "speaker-wv8kr" (UID: "b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51") : secret "speaker-certs-secret" not found Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.449570 4759 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.449594 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-memberlist podName:b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51 nodeName:}" failed. No retries permitted until 2025-12-05 00:41:17.949587959 +0000 UTC m=+1097.165248909 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-memberlist") pod "speaker-wv8kr" (UID: "b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51") : secret "metallb-memberlist" not found Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.450679 4759 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.463530 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78f52cea-319f-4493-aa77-b97f1fed1583-cert\") pod \"controller-f8648f98b-dk9mn\" (UID: \"78f52cea-319f-4493-aa77-b97f1fed1583\") " pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.471119 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddb7g\" (UniqueName: \"kubernetes.io/projected/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-kube-api-access-ddb7g\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.478616 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsd6d\" (UniqueName: \"kubernetes.io/projected/78f52cea-319f-4493-aa77-b97f1fed1583-kube-api-access-fsd6d\") pod \"controller-f8648f98b-dk9mn\" (UID: \"78f52cea-319f-4493-aa77-b97f1fed1583\") " pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.854110 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1343e9fd-38e9-4285-89a6-f4a15dfca396-metrics-certs\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.854211 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b81e7b66-fed2-4b1e-8504-22a839862f14-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7kpbw\" (UID: \"b81e7b66-fed2-4b1e-8504-22a839862f14\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.858362 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b81e7b66-fed2-4b1e-8504-22a839862f14-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7kpbw\" (UID: \"b81e7b66-fed2-4b1e-8504-22a839862f14\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.860968 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1343e9fd-38e9-4285-89a6-f4a15dfca396-metrics-certs\") pod \"frr-k8s-2hbsc\" (UID: \"1343e9fd-38e9-4285-89a6-f4a15dfca396\") " pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.955263 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f52cea-319f-4493-aa77-b97f1fed1583-metrics-certs\") pod \"controller-f8648f98b-dk9mn\" (UID: \"78f52cea-319f-4493-aa77-b97f1fed1583\") " pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.955392 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-metrics-certs\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.955534 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-memberlist\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.955710 4759 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 00:41:17 crc kubenswrapper[4759]: E1205 00:41:17.955774 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-memberlist podName:b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51 nodeName:}" failed. No retries permitted until 2025-12-05 00:41:18.955759376 +0000 UTC m=+1098.171420326 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-memberlist") pod "speaker-wv8kr" (UID: "b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51") : secret "metallb-memberlist" not found Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.961398 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f52cea-319f-4493-aa77-b97f1fed1583-metrics-certs\") pod \"controller-f8648f98b-dk9mn\" (UID: \"78f52cea-319f-4493-aa77-b97f1fed1583\") " pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:17 crc kubenswrapper[4759]: I1205 00:41:17.962299 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-metrics-certs\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:18 crc kubenswrapper[4759]: I1205 00:41:18.055221 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:18 crc kubenswrapper[4759]: I1205 00:41:18.068405 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" Dec 05 00:41:18 crc kubenswrapper[4759]: I1205 00:41:18.170119 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:18 crc kubenswrapper[4759]: I1205 00:41:18.580816 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2hbsc" event={"ID":"1343e9fd-38e9-4285-89a6-f4a15dfca396","Type":"ContainerStarted","Data":"448e04bd2ae4260a8f153da5722c2d93a9af26675c622e7c03d7b846ac9f5e92"} Dec 05 00:41:18 crc kubenswrapper[4759]: I1205 00:41:18.600974 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw"] Dec 05 00:41:18 crc kubenswrapper[4759]: W1205 00:41:18.604330 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb81e7b66_fed2_4b1e_8504_22a839862f14.slice/crio-c6032fd571e3d68fbd41b1e099d8633dc4ac8e5c1671dac014d250f4239fbb4b WatchSource:0}: Error finding container c6032fd571e3d68fbd41b1e099d8633dc4ac8e5c1671dac014d250f4239fbb4b: Status 404 returned error can't find the container with id c6032fd571e3d68fbd41b1e099d8633dc4ac8e5c1671dac014d250f4239fbb4b Dec 05 00:41:18 crc kubenswrapper[4759]: I1205 00:41:18.696653 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-dk9mn"] Dec 05 00:41:18 crc kubenswrapper[4759]: W1205 00:41:18.697984 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78f52cea_319f_4493_aa77_b97f1fed1583.slice/crio-e201c05aa40892d118ba3059ba58f1d0c2b90b2c0384a92bf7fca48dac35e3b1 WatchSource:0}: Error finding container e201c05aa40892d118ba3059ba58f1d0c2b90b2c0384a92bf7fca48dac35e3b1: Status 404 returned error can't find the container with id e201c05aa40892d118ba3059ba58f1d0c2b90b2c0384a92bf7fca48dac35e3b1 Dec 05 00:41:18 crc kubenswrapper[4759]: I1205 00:41:18.972887 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-memberlist\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:18 crc kubenswrapper[4759]: I1205 00:41:18.978025 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51-memberlist\") pod \"speaker-wv8kr\" (UID: \"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51\") " pod="metallb-system/speaker-wv8kr" Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.038493 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-wv8kr" Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.591218 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-dk9mn" event={"ID":"78f52cea-319f-4493-aa77-b97f1fed1583","Type":"ContainerStarted","Data":"bce01cf6f24c6454a667b4ea271b5191610d84611bd539e07d003c74750c4663"} Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.591550 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-dk9mn" event={"ID":"78f52cea-319f-4493-aa77-b97f1fed1583","Type":"ContainerStarted","Data":"1e2ffbbc807fcb341b19484bacaa9ebb369df4833da99da9ee6cde0730673a34"} Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.591568 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.591577 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-dk9mn" event={"ID":"78f52cea-319f-4493-aa77-b97f1fed1583","Type":"ContainerStarted","Data":"e201c05aa40892d118ba3059ba58f1d0c2b90b2c0384a92bf7fca48dac35e3b1"} Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.596191 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wv8kr" event={"ID":"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51","Type":"ContainerStarted","Data":"0efaf8c4e5467f5c1379c74e216855f37081b3967fb103651457bf0deed61787"} Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.596251 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wv8kr" event={"ID":"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51","Type":"ContainerStarted","Data":"eff9e2b9444cc1e656e7352853bc2091233ecabb7d727d2c67226a2307ebe7d6"} Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.596268 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wv8kr" event={"ID":"b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51","Type":"ContainerStarted","Data":"852f8cc957275470db158b2cd5ccf8fe494e42a0d7deb14dc2783325f3745e69"} Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.596433 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wv8kr" Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.598477 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" event={"ID":"b81e7b66-fed2-4b1e-8504-22a839862f14","Type":"ContainerStarted","Data":"c6032fd571e3d68fbd41b1e099d8633dc4ac8e5c1671dac014d250f4239fbb4b"} Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.614010 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-dk9mn" podStartSLOduration=2.613992585 podStartE2EDuration="2.613992585s" podCreationTimestamp="2025-12-05 00:41:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:41:19.608238983 +0000 UTC m=+1098.823899933" watchObservedRunningTime="2025-12-05 00:41:19.613992585 +0000 UTC m=+1098.829653535" Dec 05 00:41:19 crc kubenswrapper[4759]: I1205 00:41:19.630661 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wv8kr" podStartSLOduration=2.630645236 podStartE2EDuration="2.630645236s" podCreationTimestamp="2025-12-05 00:41:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:41:19.628619585 +0000 UTC m=+1098.844280535" watchObservedRunningTime="2025-12-05 00:41:19.630645236 +0000 UTC m=+1098.846306186" Dec 05 00:41:26 crc kubenswrapper[4759]: I1205 00:41:26.684413 4759 generic.go:334] "Generic (PLEG): container finished" podID="1343e9fd-38e9-4285-89a6-f4a15dfca396" containerID="f3e10e448a50a4f538474b13de31bc33d6b68b7c7716b90a4ae36aee361d6105" exitCode=0 Dec 05 00:41:26 crc kubenswrapper[4759]: I1205 00:41:26.684961 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2hbsc" event={"ID":"1343e9fd-38e9-4285-89a6-f4a15dfca396","Type":"ContainerDied","Data":"f3e10e448a50a4f538474b13de31bc33d6b68b7c7716b90a4ae36aee361d6105"} Dec 05 00:41:26 crc kubenswrapper[4759]: I1205 00:41:26.693151 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" event={"ID":"b81e7b66-fed2-4b1e-8504-22a839862f14","Type":"ContainerStarted","Data":"1cbddb0ffe6b2099ce03e96d73800cd98aad601314b3421a7c4869042420249b"} Dec 05 00:41:26 crc kubenswrapper[4759]: I1205 00:41:26.693823 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" Dec 05 00:41:26 crc kubenswrapper[4759]: I1205 00:41:26.736479 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" podStartSLOduration=2.230468874 podStartE2EDuration="9.736461736s" podCreationTimestamp="2025-12-05 00:41:17 +0000 UTC" firstStartedPulling="2025-12-05 00:41:18.606163532 +0000 UTC m=+1097.821824482" lastFinishedPulling="2025-12-05 00:41:26.112156364 +0000 UTC m=+1105.327817344" observedRunningTime="2025-12-05 00:41:26.73259563 +0000 UTC m=+1105.948256600" watchObservedRunningTime="2025-12-05 00:41:26.736461736 +0000 UTC m=+1105.952122696" Dec 05 00:41:27 crc kubenswrapper[4759]: I1205 00:41:27.702798 4759 generic.go:334] "Generic (PLEG): container finished" podID="1343e9fd-38e9-4285-89a6-f4a15dfca396" containerID="1b67e6ae73dec057f3664abfa5523a7bbbf4692286f354e3c49628c76cc6a3ac" exitCode=0 Dec 05 00:41:27 crc kubenswrapper[4759]: I1205 00:41:27.702951 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2hbsc" event={"ID":"1343e9fd-38e9-4285-89a6-f4a15dfca396","Type":"ContainerDied","Data":"1b67e6ae73dec057f3664abfa5523a7bbbf4692286f354e3c49628c76cc6a3ac"} Dec 05 00:41:28 crc kubenswrapper[4759]: I1205 00:41:28.174844 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-dk9mn" Dec 05 00:41:28 crc kubenswrapper[4759]: I1205 00:41:28.715341 4759 generic.go:334] "Generic (PLEG): container finished" podID="1343e9fd-38e9-4285-89a6-f4a15dfca396" containerID="e60663d9034ff1daabf8fdf9e9caab58384f67eddf27ed3eb92de30a408deb24" exitCode=0 Dec 05 00:41:28 crc kubenswrapper[4759]: I1205 00:41:28.715448 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2hbsc" event={"ID":"1343e9fd-38e9-4285-89a6-f4a15dfca396","Type":"ContainerDied","Data":"e60663d9034ff1daabf8fdf9e9caab58384f67eddf27ed3eb92de30a408deb24"} Dec 05 00:41:29 crc kubenswrapper[4759]: I1205 00:41:29.042866 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wv8kr" Dec 05 00:41:29 crc kubenswrapper[4759]: I1205 00:41:29.726139 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2hbsc" 
event={"ID":"1343e9fd-38e9-4285-89a6-f4a15dfca396","Type":"ContainerStarted","Data":"d76abcdc593e842ee72a63f60ecbfcc06be882ea0da7b9d9d4137e0dc8be60fc"} Dec 05 00:41:29 crc kubenswrapper[4759]: I1205 00:41:29.726461 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2hbsc" event={"ID":"1343e9fd-38e9-4285-89a6-f4a15dfca396","Type":"ContainerStarted","Data":"936e6213076648e25bb73adf4546b76398169312d4378e7bd3a91d8254585110"} Dec 05 00:41:29 crc kubenswrapper[4759]: I1205 00:41:29.726471 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2hbsc" event={"ID":"1343e9fd-38e9-4285-89a6-f4a15dfca396","Type":"ContainerStarted","Data":"57cb75284e1cad7a5637429b01cdeb282969e58cfbeb67564398768f941a5921"} Dec 05 00:41:29 crc kubenswrapper[4759]: I1205 00:41:29.726480 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2hbsc" event={"ID":"1343e9fd-38e9-4285-89a6-f4a15dfca396","Type":"ContainerStarted","Data":"a0700596a66bd34af13fab732159603cd677ab90aa1eee552112a652207df806"} Dec 05 00:41:29 crc kubenswrapper[4759]: I1205 00:41:29.726488 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2hbsc" event={"ID":"1343e9fd-38e9-4285-89a6-f4a15dfca396","Type":"ContainerStarted","Data":"3abea8337ea7b177c0c5eb5169e3c55114d58585c31550038467770790be71fd"} Dec 05 00:41:30 crc kubenswrapper[4759]: I1205 00:41:30.740856 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2hbsc" event={"ID":"1343e9fd-38e9-4285-89a6-f4a15dfca396","Type":"ContainerStarted","Data":"3fd43c6f540112d134da8cc70af86f2745a7c6cd631a769579b55dc2a5567bd1"} Dec 05 00:41:30 crc kubenswrapper[4759]: I1205 00:41:30.741116 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:32 crc kubenswrapper[4759]: I1205 00:41:32.357488 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-2hbsc" podStartSLOduration=7.567313772 podStartE2EDuration="15.357463174s" podCreationTimestamp="2025-12-05 00:41:17 +0000 UTC" firstStartedPulling="2025-12-05 00:41:18.296503663 +0000 UTC m=+1097.512164613" lastFinishedPulling="2025-12-05 00:41:26.086653055 +0000 UTC m=+1105.302314015" observedRunningTime="2025-12-05 00:41:30.772846403 +0000 UTC m=+1109.988507393" watchObservedRunningTime="2025-12-05 00:41:32.357463174 +0000 UTC m=+1111.573124164" Dec 05 00:41:32 crc kubenswrapper[4759]: I1205 00:41:32.362431 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-72d54"] Dec 05 00:41:32 crc kubenswrapper[4759]: I1205 00:41:32.363828 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-72d54" Dec 05 00:41:32 crc kubenswrapper[4759]: I1205 00:41:32.366258 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 05 00:41:32 crc kubenswrapper[4759]: I1205 00:41:32.366554 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 05 00:41:32 crc kubenswrapper[4759]: I1205 00:41:32.366661 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-kcnnj" Dec 05 00:41:32 crc kubenswrapper[4759]: I1205 00:41:32.374639 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-72d54"] Dec 05 00:41:32 crc kubenswrapper[4759]: I1205 00:41:32.423215 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq942\" (UniqueName: \"kubernetes.io/projected/33b83c28-8814-4e95-bb39-268909f393f7-kube-api-access-sq942\") pod \"openstack-operator-index-72d54\" (UID: \"33b83c28-8814-4e95-bb39-268909f393f7\") " pod="openstack-operators/openstack-operator-index-72d54" Dec 05 00:41:32 crc kubenswrapper[4759]: I1205 00:41:32.524927 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq942\" (UniqueName: \"kubernetes.io/projected/33b83c28-8814-4e95-bb39-268909f393f7-kube-api-access-sq942\") pod \"openstack-operator-index-72d54\" (UID: \"33b83c28-8814-4e95-bb39-268909f393f7\") " pod="openstack-operators/openstack-operator-index-72d54" Dec 05 00:41:32 crc kubenswrapper[4759]: I1205 00:41:32.542565 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq942\" (UniqueName: \"kubernetes.io/projected/33b83c28-8814-4e95-bb39-268909f393f7-kube-api-access-sq942\") pod \"openstack-operator-index-72d54\" (UID: \"33b83c28-8814-4e95-bb39-268909f393f7\") " pod="openstack-operators/openstack-operator-index-72d54" Dec 05 00:41:32 crc kubenswrapper[4759]: I1205 00:41:32.685641 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-72d54" Dec 05 00:41:33 crc kubenswrapper[4759]: I1205 00:41:33.055967 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:33 crc kubenswrapper[4759]: I1205 00:41:33.092085 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:33 crc kubenswrapper[4759]: I1205 00:41:33.670184 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-72d54"] Dec 05 00:41:33 crc kubenswrapper[4759]: W1205 00:41:33.675756 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b83c28_8814_4e95_bb39_268909f393f7.slice/crio-911b54a3cd8e33e7f1cd8a6ef89723a73a7f62193f8b97baf999d0c8580acccb WatchSource:0}: Error finding container 911b54a3cd8e33e7f1cd8a6ef89723a73a7f62193f8b97baf999d0c8580acccb: Status 404 returned error can't find the container with id 911b54a3cd8e33e7f1cd8a6ef89723a73a7f62193f8b97baf999d0c8580acccb Dec 05 00:41:33 crc kubenswrapper[4759]: I1205 00:41:33.778354 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72d54" event={"ID":"33b83c28-8814-4e95-bb39-268909f393f7","Type":"ContainerStarted","Data":"911b54a3cd8e33e7f1cd8a6ef89723a73a7f62193f8b97baf999d0c8580acccb"} Dec 05 00:41:34 crc kubenswrapper[4759]: I1205 00:41:34.434173 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:41:34 crc kubenswrapper[4759]: I1205 00:41:34.434259 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:41:36 crc kubenswrapper[4759]: I1205 00:41:36.526521 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-72d54"] Dec 05 00:41:37 crc kubenswrapper[4759]: I1205 00:41:37.353728 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lcpcr"] Dec 05 00:41:37 crc kubenswrapper[4759]: I1205 00:41:37.367776 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lcpcr" Dec 05 00:41:37 crc kubenswrapper[4759]: I1205 00:41:37.377508 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lcpcr"] Dec 05 00:41:37 crc kubenswrapper[4759]: I1205 00:41:37.416376 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shmbf\" (UniqueName: \"kubernetes.io/projected/627231bc-7c87-4c95-9a7e-ca5c295bfc69-kube-api-access-shmbf\") pod \"openstack-operator-index-lcpcr\" (UID: \"627231bc-7c87-4c95-9a7e-ca5c295bfc69\") " pod="openstack-operators/openstack-operator-index-lcpcr" Dec 05 00:41:37 crc kubenswrapper[4759]: I1205 00:41:37.518162 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shmbf\" (UniqueName: \"kubernetes.io/projected/627231bc-7c87-4c95-9a7e-ca5c295bfc69-kube-api-access-shmbf\") pod \"openstack-operator-index-lcpcr\" (UID: \"627231bc-7c87-4c95-9a7e-ca5c295bfc69\") " pod="openstack-operators/openstack-operator-index-lcpcr" Dec 05 00:41:37 crc kubenswrapper[4759]: I1205 00:41:37.552609 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shmbf\" (UniqueName: \"kubernetes.io/projected/627231bc-7c87-4c95-9a7e-ca5c295bfc69-kube-api-access-shmbf\") pod \"openstack-operator-index-lcpcr\" (UID: \"627231bc-7c87-4c95-9a7e-ca5c295bfc69\") " pod="openstack-operators/openstack-operator-index-lcpcr" Dec 05 00:41:37 crc kubenswrapper[4759]: I1205 00:41:37.705396 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lcpcr" Dec 05 00:41:37 crc kubenswrapper[4759]: I1205 00:41:37.819970 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72d54" event={"ID":"33b83c28-8814-4e95-bb39-268909f393f7","Type":"ContainerStarted","Data":"7ed5f943034f091e79b3c5d971394b5b8e223dec56e7f28bedd72627ed995547"} Dec 05 00:41:37 crc kubenswrapper[4759]: I1205 00:41:37.821176 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-72d54" podUID="33b83c28-8814-4e95-bb39-268909f393f7" containerName="registry-server" containerID="cri-o://7ed5f943034f091e79b3c5d971394b5b8e223dec56e7f28bedd72627ed995547" gracePeriod=2 Dec 05 00:41:37 crc kubenswrapper[4759]: I1205 00:41:37.857779 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-72d54" podStartSLOduration=2.561067076 podStartE2EDuration="5.857752776s" podCreationTimestamp="2025-12-05 00:41:32 +0000 UTC" firstStartedPulling="2025-12-05 00:41:33.678039532 +0000 UTC m=+1112.893700492" lastFinishedPulling="2025-12-05 00:41:36.974725242 +0000 UTC m=+1116.190386192" observedRunningTime="2025-12-05 00:41:37.846463688 +0000 UTC m=+1117.062124668" watchObservedRunningTime="2025-12-05 00:41:37.857752776 +0000 UTC m=+1117.073413736" Dec 05 00:41:38 crc kubenswrapper[4759]: I1205 00:41:38.006340 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lcpcr"] Dec 05 00:41:38 crc kubenswrapper[4759]: I1205 00:41:38.059953 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-2hbsc" Dec 05 00:41:38 crc kubenswrapper[4759]: I1205 00:41:38.073437 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7kpbw" Dec 05 00:41:38 crc kubenswrapper[4759]: I1205 00:41:38.839766 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lcpcr" event={"ID":"627231bc-7c87-4c95-9a7e-ca5c295bfc69","Type":"ContainerStarted","Data":"bdd161cd18ad19ea17cf9410005e5f951db29c46c0ae67733b4d7ef60e8bea69"} Dec 05 00:41:38 crc kubenswrapper[4759]: I1205 00:41:38.843162 4759 generic.go:334] "Generic (PLEG): container finished" podID="33b83c28-8814-4e95-bb39-268909f393f7" containerID="7ed5f943034f091e79b3c5d971394b5b8e223dec56e7f28bedd72627ed995547" exitCode=0 Dec 05 00:41:38 crc kubenswrapper[4759]: I1205 00:41:38.843221 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72d54" event={"ID":"33b83c28-8814-4e95-bb39-268909f393f7","Type":"ContainerDied","Data":"7ed5f943034f091e79b3c5d971394b5b8e223dec56e7f28bedd72627ed995547"} Dec 05 00:41:39 crc kubenswrapper[4759]: I1205 00:41:39.062987 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-72d54" Dec 05 00:41:39 crc kubenswrapper[4759]: I1205 00:41:39.144480 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq942\" (UniqueName: \"kubernetes.io/projected/33b83c28-8814-4e95-bb39-268909f393f7-kube-api-access-sq942\") pod \"33b83c28-8814-4e95-bb39-268909f393f7\" (UID: \"33b83c28-8814-4e95-bb39-268909f393f7\") " Dec 05 00:41:39 crc kubenswrapper[4759]: I1205 00:41:39.149356 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b83c28-8814-4e95-bb39-268909f393f7-kube-api-access-sq942" (OuterVolumeSpecName: "kube-api-access-sq942") pod "33b83c28-8814-4e95-bb39-268909f393f7" (UID: "33b83c28-8814-4e95-bb39-268909f393f7"). InnerVolumeSpecName "kube-api-access-sq942". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:41:39 crc kubenswrapper[4759]: I1205 00:41:39.246603 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq942\" (UniqueName: \"kubernetes.io/projected/33b83c28-8814-4e95-bb39-268909f393f7-kube-api-access-sq942\") on node \"crc\" DevicePath \"\"" Dec 05 00:41:39 crc kubenswrapper[4759]: I1205 00:41:39.855328 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lcpcr" event={"ID":"627231bc-7c87-4c95-9a7e-ca5c295bfc69","Type":"ContainerStarted","Data":"dc53ca45a17fe81564db697855293ba152530035161fe6f94d5f76872f2b7687"} Dec 05 00:41:39 crc kubenswrapper[4759]: I1205 00:41:39.857983 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72d54" event={"ID":"33b83c28-8814-4e95-bb39-268909f393f7","Type":"ContainerDied","Data":"911b54a3cd8e33e7f1cd8a6ef89723a73a7f62193f8b97baf999d0c8580acccb"} Dec 05 00:41:39 crc kubenswrapper[4759]: I1205 00:41:39.858075 4759 scope.go:117] "RemoveContainer" containerID="7ed5f943034f091e79b3c5d971394b5b8e223dec56e7f28bedd72627ed995547" Dec 05 00:41:39 crc kubenswrapper[4759]: I1205 00:41:39.858301 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-72d54" Dec 05 00:41:39 crc kubenswrapper[4759]: I1205 00:41:39.916456 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lcpcr" podStartSLOduration=2.2900553009999998 podStartE2EDuration="2.916433433s" podCreationTimestamp="2025-12-05 00:41:37 +0000 UTC" firstStartedPulling="2025-12-05 00:41:38.013231132 +0000 UTC m=+1117.228892082" lastFinishedPulling="2025-12-05 00:41:38.639609264 +0000 UTC m=+1117.855270214" observedRunningTime="2025-12-05 00:41:39.912839404 +0000 UTC m=+1119.128500364" watchObservedRunningTime="2025-12-05 00:41:39.916433433 +0000 UTC m=+1119.132094403" Dec 05 00:41:39 crc kubenswrapper[4759]: I1205 00:41:39.944349 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-72d54"] Dec 05 00:41:39 crc kubenswrapper[4759]: I1205 00:41:39.949528 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-72d54"] Dec 05 00:41:41 crc kubenswrapper[4759]: I1205 00:41:41.171485 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b83c28-8814-4e95-bb39-268909f393f7" path="/var/lib/kubelet/pods/33b83c28-8814-4e95-bb39-268909f393f7/volumes" Dec 05 00:41:47 crc kubenswrapper[4759]: I1205 00:41:47.705716 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-lcpcr" Dec 05 00:41:47 crc kubenswrapper[4759]: I1205 00:41:47.706531 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-lcpcr" Dec 05 00:41:47 crc kubenswrapper[4759]: I1205 00:41:47.757541 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-lcpcr" Dec 05 00:41:47 crc kubenswrapper[4759]: I1205 00:41:47.976730 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-lcpcr" Dec 05 00:41:51 crc kubenswrapper[4759]: I1205 00:41:51.988000 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4"] Dec 05 00:41:51 crc kubenswrapper[4759]: E1205 00:41:51.988808 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b83c28-8814-4e95-bb39-268909f393f7" containerName="registry-server" Dec 05 00:41:51 crc kubenswrapper[4759]: I1205 00:41:51.988831 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b83c28-8814-4e95-bb39-268909f393f7" containerName="registry-server" Dec 05 00:41:51 crc kubenswrapper[4759]: I1205 00:41:51.989060 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b83c28-8814-4e95-bb39-268909f393f7" containerName="registry-server" Dec 05 00:41:51 crc kubenswrapper[4759]: I1205 00:41:51.990715 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:51 crc kubenswrapper[4759]: I1205 00:41:51.994167 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-m65qw" Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.008980 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4"] Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.048512 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-util\") pod \"1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.048684 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54jpr\" (UniqueName: \"kubernetes.io/projected/8fbf91d7-0670-400f-92fe-30da90f6e105-kube-api-access-54jpr\") pod \"1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.048824 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-bundle\") pod \"1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.150301 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-bundle\") pod \"1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.150549 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-util\") pod \"1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.150629 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54jpr\" (UniqueName: \"kubernetes.io/projected/8fbf91d7-0670-400f-92fe-30da90f6e105-kube-api-access-54jpr\") pod \"1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.151271 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-bundle\") pod \"1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.151437 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-util\") pod \"1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.179358 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54jpr\" (UniqueName: \"kubernetes.io/projected/8fbf91d7-0670-400f-92fe-30da90f6e105-kube-api-access-54jpr\") pod \"1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.321946 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.868684 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4"] Dec 05 00:41:52 crc kubenswrapper[4759]: W1205 00:41:52.877171 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fbf91d7_0670_400f_92fe_30da90f6e105.slice/crio-c8d2ecf3fa1ee89e1fb39f1e9efb0cb8f15e805baf0e35775f8369450d85fd0a WatchSource:0}: Error finding container c8d2ecf3fa1ee89e1fb39f1e9efb0cb8f15e805baf0e35775f8369450d85fd0a: Status 404 returned error can't find the container with id c8d2ecf3fa1ee89e1fb39f1e9efb0cb8f15e805baf0e35775f8369450d85fd0a Dec 05 00:41:52 crc kubenswrapper[4759]: I1205 00:41:52.982029 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" event={"ID":"8fbf91d7-0670-400f-92fe-30da90f6e105","Type":"ContainerStarted","Data":"c8d2ecf3fa1ee89e1fb39f1e9efb0cb8f15e805baf0e35775f8369450d85fd0a"} Dec 05 00:41:54 crc kubenswrapper[4759]: I1205 00:41:53.999123 4759 generic.go:334] "Generic (PLEG): container finished" podID="8fbf91d7-0670-400f-92fe-30da90f6e105" containerID="829e193ed7e84d6836541150806da5d668c861f212219259d75d45f1f7b96a5a" exitCode=0 Dec 05 00:41:54 crc kubenswrapper[4759]: I1205 00:41:53.999612 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" event={"ID":"8fbf91d7-0670-400f-92fe-30da90f6e105","Type":"ContainerDied","Data":"829e193ed7e84d6836541150806da5d668c861f212219259d75d45f1f7b96a5a"} Dec 05 00:41:55 crc kubenswrapper[4759]: I1205 00:41:55.013195 4759 generic.go:334] "Generic (PLEG): container finished" podID="8fbf91d7-0670-400f-92fe-30da90f6e105" containerID="f096a15101d0e138404e5731d7efaaa2d38fe9fa36748df1b14e7bcd150dfb0c" exitCode=0 Dec 05 00:41:55 crc kubenswrapper[4759]: I1205 00:41:55.013283 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" event={"ID":"8fbf91d7-0670-400f-92fe-30da90f6e105","Type":"ContainerDied","Data":"f096a15101d0e138404e5731d7efaaa2d38fe9fa36748df1b14e7bcd150dfb0c"} Dec 05 00:41:56 crc kubenswrapper[4759]: I1205 00:41:56.026363 4759 generic.go:334] "Generic (PLEG): container finished" podID="8fbf91d7-0670-400f-92fe-30da90f6e105" containerID="c6f8d06fc7ad75f48805e1ffb7b9bb65fff5fb41f7e5de9b08c92cbebdcfcbc3" exitCode=0 Dec 05 00:41:56 crc kubenswrapper[4759]: I1205 00:41:56.026519 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" event={"ID":"8fbf91d7-0670-400f-92fe-30da90f6e105","Type":"ContainerDied","Data":"c6f8d06fc7ad75f48805e1ffb7b9bb65fff5fb41f7e5de9b08c92cbebdcfcbc3"} Dec 05 00:41:57 crc kubenswrapper[4759]: I1205 00:41:57.330451 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:57 crc kubenswrapper[4759]: I1205 00:41:57.440814 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54jpr\" (UniqueName: \"kubernetes.io/projected/8fbf91d7-0670-400f-92fe-30da90f6e105-kube-api-access-54jpr\") pod \"8fbf91d7-0670-400f-92fe-30da90f6e105\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " Dec 05 00:41:57 crc kubenswrapper[4759]: I1205 00:41:57.440942 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-util\") pod \"8fbf91d7-0670-400f-92fe-30da90f6e105\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " Dec 05 00:41:57 crc kubenswrapper[4759]: I1205 00:41:57.441002 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-bundle\") pod \"8fbf91d7-0670-400f-92fe-30da90f6e105\" (UID: \"8fbf91d7-0670-400f-92fe-30da90f6e105\") " Dec 05 00:41:57 crc kubenswrapper[4759]: I1205 00:41:57.441953 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-bundle" (OuterVolumeSpecName: "bundle") pod "8fbf91d7-0670-400f-92fe-30da90f6e105" (UID: "8fbf91d7-0670-400f-92fe-30da90f6e105"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:41:57 crc kubenswrapper[4759]: I1205 00:41:57.450202 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbf91d7-0670-400f-92fe-30da90f6e105-kube-api-access-54jpr" (OuterVolumeSpecName: "kube-api-access-54jpr") pod "8fbf91d7-0670-400f-92fe-30da90f6e105" (UID: "8fbf91d7-0670-400f-92fe-30da90f6e105"). InnerVolumeSpecName "kube-api-access-54jpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:41:57 crc kubenswrapper[4759]: I1205 00:41:57.460019 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-util" (OuterVolumeSpecName: "util") pod "8fbf91d7-0670-400f-92fe-30da90f6e105" (UID: "8fbf91d7-0670-400f-92fe-30da90f6e105"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:41:57 crc kubenswrapper[4759]: I1205 00:41:57.543172 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54jpr\" (UniqueName: \"kubernetes.io/projected/8fbf91d7-0670-400f-92fe-30da90f6e105-kube-api-access-54jpr\") on node \"crc\" DevicePath \"\"" Dec 05 00:41:57 crc kubenswrapper[4759]: I1205 00:41:57.543233 4759 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-util\") on node \"crc\" DevicePath \"\"" Dec 05 00:41:57 crc kubenswrapper[4759]: I1205 00:41:57.543254 4759 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8fbf91d7-0670-400f-92fe-30da90f6e105-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:41:58 crc kubenswrapper[4759]: I1205 00:41:58.043891 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" Dec 05 00:41:58 crc kubenswrapper[4759]: I1205 00:41:58.043879 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4" event={"ID":"8fbf91d7-0670-400f-92fe-30da90f6e105","Type":"ContainerDied","Data":"c8d2ecf3fa1ee89e1fb39f1e9efb0cb8f15e805baf0e35775f8369450d85fd0a"} Dec 05 00:41:58 crc kubenswrapper[4759]: I1205 00:41:58.044078 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8d2ecf3fa1ee89e1fb39f1e9efb0cb8f15e805baf0e35775f8369450d85fd0a" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.118812 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2"] Dec 05 00:42:04 crc kubenswrapper[4759]: E1205 00:42:04.119460 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbf91d7-0670-400f-92fe-30da90f6e105" containerName="pull" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.119472 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbf91d7-0670-400f-92fe-30da90f6e105" containerName="pull" Dec 05 00:42:04 crc kubenswrapper[4759]: E1205 00:42:04.119488 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbf91d7-0670-400f-92fe-30da90f6e105" containerName="util" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.119494 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbf91d7-0670-400f-92fe-30da90f6e105" containerName="util" Dec 05 00:42:04 crc kubenswrapper[4759]: E1205 00:42:04.119507 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbf91d7-0670-400f-92fe-30da90f6e105" containerName="extract" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.119513 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbf91d7-0670-400f-92fe-30da90f6e105" containerName="extract" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.119648 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbf91d7-0670-400f-92fe-30da90f6e105" containerName="extract" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.120119 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.122912 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-lpx2k" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.211571 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2"] Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.255221 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs8m6\" (UniqueName: \"kubernetes.io/projected/c02805c1-2950-4e50-9163-a3ca8d5c4319-kube-api-access-cs8m6\") pod \"openstack-operator-controller-operator-7c496d6cb7-z59q2\" (UID: \"c02805c1-2950-4e50-9163-a3ca8d5c4319\") " pod="openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.356802 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs8m6\" (UniqueName: \"kubernetes.io/projected/c02805c1-2950-4e50-9163-a3ca8d5c4319-kube-api-access-cs8m6\") pod \"openstack-operator-controller-operator-7c496d6cb7-z59q2\" (UID: \"c02805c1-2950-4e50-9163-a3ca8d5c4319\") " pod="openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.374036 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs8m6\" (UniqueName: \"kubernetes.io/projected/c02805c1-2950-4e50-9163-a3ca8d5c4319-kube-api-access-cs8m6\") pod \"openstack-operator-controller-operator-7c496d6cb7-z59q2\" (UID: \"c02805c1-2950-4e50-9163-a3ca8d5c4319\") " pod="openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.433875 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.433980 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.434058 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.435098 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"608c9e503141cdce7db4aff57a9358edcfc9dc62d8dc9293f4521db2be6238ce"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.435210 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" 
podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://608c9e503141cdce7db4aff57a9358edcfc9dc62d8dc9293f4521db2be6238ce" gracePeriod=600 Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.443274 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2" Dec 05 00:42:04 crc kubenswrapper[4759]: I1205 00:42:04.923706 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2"] Dec 05 00:42:04 crc kubenswrapper[4759]: W1205 00:42:04.930914 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc02805c1_2950_4e50_9163_a3ca8d5c4319.slice/crio-0dc01b357239111d119b26452724379a9c78df0988325172f6d65370c9f8dc7d WatchSource:0}: Error finding container 0dc01b357239111d119b26452724379a9c78df0988325172f6d65370c9f8dc7d: Status 404 returned error can't find the container with id 0dc01b357239111d119b26452724379a9c78df0988325172f6d65370c9f8dc7d Dec 05 00:42:05 crc kubenswrapper[4759]: I1205 00:42:05.105109 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2" event={"ID":"c02805c1-2950-4e50-9163-a3ca8d5c4319","Type":"ContainerStarted","Data":"0dc01b357239111d119b26452724379a9c78df0988325172f6d65370c9f8dc7d"} Dec 05 00:42:05 crc kubenswrapper[4759]: I1205 00:42:05.107056 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="608c9e503141cdce7db4aff57a9358edcfc9dc62d8dc9293f4521db2be6238ce" exitCode=0 Dec 05 00:42:05 crc kubenswrapper[4759]: I1205 00:42:05.107089 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"608c9e503141cdce7db4aff57a9358edcfc9dc62d8dc9293f4521db2be6238ce"} Dec 05 00:42:05 crc kubenswrapper[4759]: I1205 00:42:05.107109 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"398c049422bc0a204c0d315a98c453ec61f78c5e1ec3ea26a6b6394b0111caae"} Dec 05 00:42:05 crc kubenswrapper[4759]: I1205 00:42:05.107124 4759 scope.go:117] "RemoveContainer" containerID="d9dc561c03abbf48d110afab06130c27d9acde9e46805bcdeb203f6dfe142c6b" Dec 05 00:42:10 crc kubenswrapper[4759]: I1205 00:42:10.157485 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2" event={"ID":"c02805c1-2950-4e50-9163-a3ca8d5c4319","Type":"ContainerStarted","Data":"6e7240a9ca2bea61f1fdbd1df4b860f1b197cd9a38dcc81aabe82b5cf7ba9e3b"} Dec 05 00:42:10 crc kubenswrapper[4759]: I1205 00:42:10.158208 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2" Dec 05 00:42:10 crc kubenswrapper[4759]: I1205 00:42:10.200455 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2" podStartSLOduration=1.566511145 podStartE2EDuration="6.200423554s" podCreationTimestamp="2025-12-05 00:42:04 +0000 UTC" firstStartedPulling="2025-12-05 
00:42:04.933146371 +0000 UTC m=+1144.148807321" lastFinishedPulling="2025-12-05 00:42:09.56705875 +0000 UTC m=+1148.782719730" observedRunningTime="2025-12-05 00:42:10.193423742 +0000 UTC m=+1149.409084752" watchObservedRunningTime="2025-12-05 00:42:10.200423554 +0000 UTC m=+1149.416084564" Dec 05 00:42:14 crc kubenswrapper[4759]: I1205 00:42:14.446938 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7c496d6cb7-z59q2" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.866473 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj"] Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.868647 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.882713 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-45pjf" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.884601 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr"] Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.885857 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.888180 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-v4h4p" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.893981 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj"] Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.901915 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr"] Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.912520 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82"] Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.913918 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.917589 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-rl6pf" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.931185 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82"] Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.938811 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr"] Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.940904 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.951525 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn"] Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.952581 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.954134 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-lkbkk" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.958618 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-4sl4n" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.959724 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqfnk\" (UniqueName: \"kubernetes.io/projected/5e28b15c-c39e-463a-b9a2-6f6df5addaf8-kube-api-access-bqfnk\") pod \"barbican-operator-controller-manager-7d9dfd778-ztxcj\" (UID: \"5e28b15c-c39e-463a-b9a2-6f6df5addaf8\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.959771 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9jxl\" (UniqueName: \"kubernetes.io/projected/310627fe-09af-4a51-8312-e2b3841d6634-kube-api-access-w9jxl\") pod \"cinder-operator-controller-manager-859b6ccc6-zczgr\" (UID: \"310627fe-09af-4a51-8312-e2b3841d6634\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.970625 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr"] Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.981529 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9"] Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.982607 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.984109 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-srbkn" Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.988395 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn"] Dec 05 00:42:33 crc kubenswrapper[4759]: I1205 00:42:33.997173 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.015825 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.016929 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.020582 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.021868 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5j9x9" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.022491 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.039413 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.041080 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.053478 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.055138 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.059426 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.060568 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqfnk\" (UniqueName: \"kubernetes.io/projected/5e28b15c-c39e-463a-b9a2-6f6df5addaf8-kube-api-access-bqfnk\") pod \"barbican-operator-controller-manager-7d9dfd778-ztxcj\" (UID: \"5e28b15c-c39e-463a-b9a2-6f6df5addaf8\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.060627 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5t8l\" (UniqueName: \"kubernetes.io/projected/f0212865-8c85-4b7c-855c-baa0fc705bf8-kube-api-access-b5t8l\") pod \"heat-operator-controller-manager-5f64f6f8bb-lrxdn\" (UID: \"f0212865-8c85-4b7c-855c-baa0fc705bf8\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.060663 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7wvh\" (UniqueName: \"kubernetes.io/projected/7377061e-a243-49b5-9728-4aaa2462445e-kube-api-access-j7wvh\") pod \"glance-operator-controller-manager-77987cd8cd-s6jzr\" (UID: \"7377061e-a243-49b5-9728-4aaa2462445e\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.060691 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9jxl\" (UniqueName: \"kubernetes.io/projected/310627fe-09af-4a51-8312-e2b3841d6634-kube-api-access-w9jxl\") pod \"cinder-operator-controller-manager-859b6ccc6-zczgr\" (UID: \"310627fe-09af-4a51-8312-e2b3841d6634\") " 
pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.060721 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbx7x\" (UniqueName: \"kubernetes.io/projected/faf33139-8ab8-400c-8a2a-bf746d11f7e7-kube-api-access-qbx7x\") pod \"designate-operator-controller-manager-78b4bc895b-6gh82\" (UID: \"faf33139-8ab8-400c-8a2a-bf746d11f7e7\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.060746 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7ww\" (UniqueName: \"kubernetes.io/projected/a712ae8c-434d-43f7-ab4d-b385eee4eabf-kube-api-access-cb7ww\") pod \"horizon-operator-controller-manager-68c6d99b8f-jrlg9\" (UID: \"a712ae8c-434d-43f7-ab4d-b385eee4eabf\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.072895 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-f7brv" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.073080 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-5drhq" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.073196 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.074676 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.079757 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-cw5kl" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.081020 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.082189 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.087653 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-f8gcf" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.087697 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.099366 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.106827 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.110072 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9jxl\" (UniqueName: \"kubernetes.io/projected/310627fe-09af-4a51-8312-e2b3841d6634-kube-api-access-w9jxl\") pod \"cinder-operator-controller-manager-859b6ccc6-zczgr\" (UID: \"310627fe-09af-4a51-8312-e2b3841d6634\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.113956 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqfnk\" (UniqueName: \"kubernetes.io/projected/5e28b15c-c39e-463a-b9a2-6f6df5addaf8-kube-api-access-bqfnk\") pod \"barbican-operator-controller-manager-7d9dfd778-ztxcj\" (UID: \"5e28b15c-c39e-463a-b9a2-6f6df5addaf8\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.140381 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.141854 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.150825 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mwhq5" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.162674 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdsts\" (UniqueName: \"kubernetes.io/projected/1cce914e-3baa-4146-a52c-e054ee0c1eed-kube-api-access-zdsts\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.162728 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrrcc\" (UniqueName: \"kubernetes.io/projected/f07c10f0-5bec-4421-8ff0-2c659e42377b-kube-api-access-vrrcc\") pod \"ironic-operator-controller-manager-6c548fd776-7s2r8\" (UID: \"f07c10f0-5bec-4421-8ff0-2c659e42377b\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.162775 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krjkw\" (UniqueName: \"kubernetes.io/projected/1626dead-b9fd-4fae-af93-e2332112626f-kube-api-access-krjkw\") pod \"keystone-operator-controller-manager-7765d96ddf-5h7t4\" (UID: \"1626dead-b9fd-4fae-af93-e2332112626f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.162828 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hq2p\" (UniqueName: \"kubernetes.io/projected/cbc0cab7-b730-4ada-994d-eb8ae2e014df-kube-api-access-8hq2p\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ckxbs\" (UID: \"cbc0cab7-b730-4ada-994d-eb8ae2e014df\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.168470 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.168563 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5t8l\" (UniqueName: \"kubernetes.io/projected/f0212865-8c85-4b7c-855c-baa0fc705bf8-kube-api-access-b5t8l\") pod \"heat-operator-controller-manager-5f64f6f8bb-lrxdn\" (UID: \"f0212865-8c85-4b7c-855c-baa0fc705bf8\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.168607 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7wvh\" (UniqueName: \"kubernetes.io/projected/7377061e-a243-49b5-9728-4aaa2462445e-kube-api-access-j7wvh\") pod \"glance-operator-controller-manager-77987cd8cd-s6jzr\" (UID: \"7377061e-a243-49b5-9728-4aaa2462445e\") " 
pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.168641 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbx7x\" (UniqueName: \"kubernetes.io/projected/faf33139-8ab8-400c-8a2a-bf746d11f7e7-kube-api-access-qbx7x\") pod \"designate-operator-controller-manager-78b4bc895b-6gh82\" (UID: \"faf33139-8ab8-400c-8a2a-bf746d11f7e7\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.168668 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7ww\" (UniqueName: \"kubernetes.io/projected/a712ae8c-434d-43f7-ab4d-b385eee4eabf-kube-api-access-cb7ww\") pod \"horizon-operator-controller-manager-68c6d99b8f-jrlg9\" (UID: \"a712ae8c-434d-43f7-ab4d-b385eee4eabf\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.168727 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6jck\" (UniqueName: \"kubernetes.io/projected/785b512b-7fa8-4480-b042-3811f10e3659-kube-api-access-l6jck\") pod \"manila-operator-controller-manager-7c79b5df47-zh2wq\" (UID: \"785b512b-7fa8-4480-b042-3811f10e3659\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.189755 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.197714 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5t8l\" (UniqueName: \"kubernetes.io/projected/f0212865-8c85-4b7c-855c-baa0fc705bf8-kube-api-access-b5t8l\") pod \"heat-operator-controller-manager-5f64f6f8bb-lrxdn\" (UID: \"f0212865-8c85-4b7c-855c-baa0fc705bf8\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.203247 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.280990 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7ww\" (UniqueName: \"kubernetes.io/projected/a712ae8c-434d-43f7-ab4d-b385eee4eabf-kube-api-access-cb7ww\") pod \"horizon-operator-controller-manager-68c6d99b8f-jrlg9\" (UID: \"a712ae8c-434d-43f7-ab4d-b385eee4eabf\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.281456 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbx7x\" (UniqueName: \"kubernetes.io/projected/faf33139-8ab8-400c-8a2a-bf746d11f7e7-kube-api-access-qbx7x\") pod \"designate-operator-controller-manager-78b4bc895b-6gh82\" (UID: \"faf33139-8ab8-400c-8a2a-bf746d11f7e7\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.281825 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdsts\" (UniqueName: \"kubernetes.io/projected/1cce914e-3baa-4146-a52c-e054ee0c1eed-kube-api-access-zdsts\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.281866 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrrcc\" (UniqueName: \"kubernetes.io/projected/f07c10f0-5bec-4421-8ff0-2c659e42377b-kube-api-access-vrrcc\") pod \"ironic-operator-controller-manager-6c548fd776-7s2r8\" (UID: \"f07c10f0-5bec-4421-8ff0-2c659e42377b\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.281923 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krjkw\" (UniqueName: \"kubernetes.io/projected/1626dead-b9fd-4fae-af93-e2332112626f-kube-api-access-krjkw\") pod \"keystone-operator-controller-manager-7765d96ddf-5h7t4\" (UID: \"1626dead-b9fd-4fae-af93-e2332112626f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.281975 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hq2p\" (UniqueName: \"kubernetes.io/projected/cbc0cab7-b730-4ada-994d-eb8ae2e014df-kube-api-access-8hq2p\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ckxbs\" (UID: \"cbc0cab7-b730-4ada-994d-eb8ae2e014df\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.282014 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t47jd\" (UniqueName: \"kubernetes.io/projected/2dcdddec-138e-46fd-ab1d-15e4c4a06a15-kube-api-access-t47jd\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-p5g9w\" (UID: \"2dcdddec-138e-46fd-ab1d-15e4c4a06a15\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.282050 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.282109 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6jck\" (UniqueName: \"kubernetes.io/projected/785b512b-7fa8-4480-b042-3811f10e3659-kube-api-access-l6jck\") pod \"manila-operator-controller-manager-7c79b5df47-zh2wq\" (UID: \"785b512b-7fa8-4480-b042-3811f10e3659\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq" Dec 05 00:42:34 crc kubenswrapper[4759]: E1205 00:42:34.286958 4759 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:34 crc kubenswrapper[4759]: E1205 00:42:34.287032 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert podName:1cce914e-3baa-4146-a52c-e054ee0c1eed nodeName:}" failed. No retries permitted until 2025-12-05 00:42:34.787014842 +0000 UTC m=+1174.002675792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert") pod "infra-operator-controller-manager-57548d458d-9jfsw" (UID: "1cce914e-3baa-4146-a52c-e054ee0c1eed") : secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.298645 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.307827 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.320021 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.332980 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.371180 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdsts\" (UniqueName: \"kubernetes.io/projected/1cce914e-3baa-4146-a52c-e054ee0c1eed-kube-api-access-zdsts\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.371227 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7wvh\" (UniqueName: \"kubernetes.io/projected/7377061e-a243-49b5-9728-4aaa2462445e-kube-api-access-j7wvh\") pod \"glance-operator-controller-manager-77987cd8cd-s6jzr\" (UID: \"7377061e-a243-49b5-9728-4aaa2462445e\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.375779 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6jck\" (UniqueName: \"kubernetes.io/projected/785b512b-7fa8-4480-b042-3811f10e3659-kube-api-access-l6jck\") pod \"manila-operator-controller-manager-7c79b5df47-zh2wq\" (UID: \"785b512b-7fa8-4480-b042-3811f10e3659\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.375820 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrrcc\" (UniqueName: \"kubernetes.io/projected/f07c10f0-5bec-4421-8ff0-2c659e42377b-kube-api-access-vrrcc\") pod \"ironic-operator-controller-manager-6c548fd776-7s2r8\" (UID: \"f07c10f0-5bec-4421-8ff0-2c659e42377b\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.380466 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.381413 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hq2p\" (UniqueName: \"kubernetes.io/projected/cbc0cab7-b730-4ada-994d-eb8ae2e014df-kube-api-access-8hq2p\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ckxbs\" (UID: \"cbc0cab7-b730-4ada-994d-eb8ae2e014df\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.402655 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t47jd\" (UniqueName: \"kubernetes.io/projected/2dcdddec-138e-46fd-ab1d-15e4c4a06a15-kube-api-access-t47jd\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-p5g9w\" (UID: \"2dcdddec-138e-46fd-ab1d-15e4c4a06a15\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.403698 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krjkw\" (UniqueName: \"kubernetes.io/projected/1626dead-b9fd-4fae-af93-e2332112626f-kube-api-access-krjkw\") pod \"keystone-operator-controller-manager-7765d96ddf-5h7t4\" (UID: \"1626dead-b9fd-4fae-af93-e2332112626f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.404091 4759 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.404543 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.412845 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-k4pvv" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.429368 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.479778 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.497894 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t47jd\" (UniqueName: \"kubernetes.io/projected/2dcdddec-138e-46fd-ab1d-15e4c4a06a15-kube-api-access-t47jd\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-p5g9w\" (UID: \"2dcdddec-138e-46fd-ab1d-15e4c4a06a15\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.499082 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.499969 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.500475 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.505164 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jpld\" (UniqueName: \"kubernetes.io/projected/976da4b0-9b83-4ffe-9cf2-a07c3e149e04-kube-api-access-2jpld\") pod \"nova-operator-controller-manager-697bc559fc-dcs2p\" (UID: \"976da4b0-9b83-4ffe-9cf2-a07c3e149e04\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.523952 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.524614 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bl7ft" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.533085 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.537206 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.539640 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-c9xmb" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.552710 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.553943 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.556712 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-zl5cg" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.581071 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.582930 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.599871 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.605473 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-f9b75"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.606825 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-f9b75" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.607481 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsn95\" (UniqueName: \"kubernetes.io/projected/9894d9a4-5121-4345-ab9c-4f770f4e4bb0-kube-api-access-wsn95\") pod \"octavia-operator-controller-manager-998648c74-dwt5n\" (UID: \"9894d9a4-5121-4345-ab9c-4f770f4e4bb0\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.607526 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpttr\" (UniqueName: \"kubernetes.io/projected/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-kube-api-access-wpttr\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: \"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.607557 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: \"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.607661 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w25f\" (UniqueName: \"kubernetes.io/projected/0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da-kube-api-access-2w25f\") pod \"ovn-operator-controller-manager-b6456fdb6-j587r\" (UID: \"0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.607711 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jpld\" (UniqueName: \"kubernetes.io/projected/976da4b0-9b83-4ffe-9cf2-a07c3e149e04-kube-api-access-2jpld\") pod \"nova-operator-controller-manager-697bc559fc-dcs2p\" (UID: \"976da4b0-9b83-4ffe-9cf2-a07c3e149e04\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.609684 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-nbzbt" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.610120 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.616381 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.642648 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.643927 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.645727 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jpld\" (UniqueName: \"kubernetes.io/projected/976da4b0-9b83-4ffe-9cf2-a07c3e149e04-kube-api-access-2jpld\") pod \"nova-operator-controller-manager-697bc559fc-dcs2p\" (UID: \"976da4b0-9b83-4ffe-9cf2-a07c3e149e04\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.651919 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-d4g4w" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.666638 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-f9b75"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.678283 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.700673 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.702219 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.707846 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-dsz2d" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.712239 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.714150 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bzd5\" (UniqueName: \"kubernetes.io/projected/da012733-6903-4607-9be5-17c81d20ae6b-kube-api-access-8bzd5\") pod \"swift-operator-controller-manager-5f8c65bbfc-qxf6v\" (UID: \"da012733-6903-4607-9be5-17c81d20ae6b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.714236 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tkln\" (UniqueName: \"kubernetes.io/projected/f35362c5-4886-42dc-a633-c018e7f6aaf2-kube-api-access-6tkln\") pod \"placement-operator-controller-manager-78f8948974-f9b75\" (UID: \"f35362c5-4886-42dc-a633-c018e7f6aaf2\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-f9b75" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.714284 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsn95\" (UniqueName: \"kubernetes.io/projected/9894d9a4-5121-4345-ab9c-4f770f4e4bb0-kube-api-access-wsn95\") pod \"octavia-operator-controller-manager-998648c74-dwt5n\" (UID: \"9894d9a4-5121-4345-ab9c-4f770f4e4bb0\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.714339 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wpttr\" (UniqueName: \"kubernetes.io/projected/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-kube-api-access-wpttr\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: \"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.714367 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: \"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.714393 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w25f\" (UniqueName: \"kubernetes.io/projected/0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da-kube-api-access-2w25f\") pod \"ovn-operator-controller-manager-b6456fdb6-j587r\" (UID: \"0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" Dec 05 00:42:34 crc kubenswrapper[4759]: E1205 00:42:34.715633 4759 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:34 crc kubenswrapper[4759]: E1205 00:42:34.715701 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert podName:33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9 nodeName:}" failed. No retries permitted until 2025-12-05 00:42:35.215681099 +0000 UTC m=+1174.431342049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" (UID: "33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.729833 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.743426 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.747977 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.751238 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-fxmv2" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.751273 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpttr\" (UniqueName: \"kubernetes.io/projected/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-kube-api-access-wpttr\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: \"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.758455 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.760238 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsn95\" (UniqueName: \"kubernetes.io/projected/9894d9a4-5121-4345-ab9c-4f770f4e4bb0-kube-api-access-wsn95\") pod \"octavia-operator-controller-manager-998648c74-dwt5n\" (UID: \"9894d9a4-5121-4345-ab9c-4f770f4e4bb0\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.760954 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w25f\" (UniqueName: \"kubernetes.io/projected/0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da-kube-api-access-2w25f\") pod \"ovn-operator-controller-manager-b6456fdb6-j587r\" (UID: \"0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.764126 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.765775 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.770079 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-95r2m" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.774064 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.808730 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.816099 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.817804 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l782n\" (UniqueName: \"kubernetes.io/projected/b6222e18-ffef-4dc6-b327-3b06bb91d75a-kube-api-access-l782n\") pod \"telemetry-operator-controller-manager-6578c5f884-gml69\" (UID: \"b6222e18-ffef-4dc6-b327-3b06bb91d75a\") " pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.817885 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q5wg\" (UniqueName: \"kubernetes.io/projected/617fefa5-c3f6-450e-a569-8ee3dd12f882-kube-api-access-7q5wg\") pod \"test-operator-controller-manager-5854674fcc-8m9f7\" (UID: \"617fefa5-c3f6-450e-a569-8ee3dd12f882\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.817914 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bzd5\" (UniqueName: \"kubernetes.io/projected/da012733-6903-4607-9be5-17c81d20ae6b-kube-api-access-8bzd5\") pod \"swift-operator-controller-manager-5f8c65bbfc-qxf6v\" (UID: \"da012733-6903-4607-9be5-17c81d20ae6b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.817960 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2rj4\" (UniqueName: \"kubernetes.io/projected/e081130e-f14c-489c-9e4e-faab3dbdee6c-kube-api-access-s2rj4\") pod \"watcher-operator-controller-manager-769dc69bc-w42fg\" (UID: \"e081130e-f14c-489c-9e4e-faab3dbdee6c\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.818321 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.818350 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tkln\" (UniqueName: \"kubernetes.io/projected/f35362c5-4886-42dc-a633-c018e7f6aaf2-kube-api-access-6tkln\") pod \"placement-operator-controller-manager-78f8948974-f9b75\" (UID: \"f35362c5-4886-42dc-a633-c018e7f6aaf2\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-f9b75" Dec 05 00:42:34 crc kubenswrapper[4759]: E1205 00:42:34.818811 4759 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:34 crc kubenswrapper[4759]: E1205 00:42:34.818854 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert podName:1cce914e-3baa-4146-a52c-e054ee0c1eed nodeName:}" failed. 
No retries permitted until 2025-12-05 00:42:35.81884197 +0000 UTC m=+1175.034502920 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert") pod "infra-operator-controller-manager-57548d458d-9jfsw" (UID: "1cce914e-3baa-4146-a52c-e054ee0c1eed") : secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.819985 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.822432 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.822671 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.823317 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-pdfnv" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.840014 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bzd5\" (UniqueName: \"kubernetes.io/projected/da012733-6903-4607-9be5-17c81d20ae6b-kube-api-access-8bzd5\") pod \"swift-operator-controller-manager-5f8c65bbfc-qxf6v\" (UID: \"da012733-6903-4607-9be5-17c81d20ae6b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.840493 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.847914 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tkln\" (UniqueName: \"kubernetes.io/projected/f35362c5-4886-42dc-a633-c018e7f6aaf2-kube-api-access-6tkln\") pod \"placement-operator-controller-manager-78f8948974-f9b75\" (UID: \"f35362c5-4886-42dc-a633-c018e7f6aaf2\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-f9b75" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.861323 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.862500 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.869204 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-8rlp7" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.871029 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm"] Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.896439 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.908891 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.921108 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcqjf\" (UniqueName: \"kubernetes.io/projected/d832c1ee-6d66-4cd7-87eb-dc2d34f801cc-kube-api-access-mcqjf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-85hsm\" (UID: \"d832c1ee-6d66-4cd7-87eb-dc2d34f801cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.921177 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5wg\" (UniqueName: \"kubernetes.io/projected/617fefa5-c3f6-450e-a569-8ee3dd12f882-kube-api-access-7q5wg\") pod \"test-operator-controller-manager-5854674fcc-8m9f7\" (UID: \"617fefa5-c3f6-450e-a569-8ee3dd12f882\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.921456 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.921510 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhk6q\" (UniqueName: \"kubernetes.io/projected/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-kube-api-access-rhk6q\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.921543 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l782n\" (UniqueName: \"kubernetes.io/projected/b6222e18-ffef-4dc6-b327-3b06bb91d75a-kube-api-access-l782n\") pod \"telemetry-operator-controller-manager-6578c5f884-gml69\" (UID: \"b6222e18-ffef-4dc6-b327-3b06bb91d75a\") " pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.923182 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2rj4\" (UniqueName: \"kubernetes.io/projected/e081130e-f14c-489c-9e4e-faab3dbdee6c-kube-api-access-s2rj4\") pod \"watcher-operator-controller-manager-769dc69bc-w42fg\" (UID: \"e081130e-f14c-489c-9e4e-faab3dbdee6c\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.923261 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.943357 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2rj4\" (UniqueName: 
\"kubernetes.io/projected/e081130e-f14c-489c-9e4e-faab3dbdee6c-kube-api-access-s2rj4\") pod \"watcher-operator-controller-manager-769dc69bc-w42fg\" (UID: \"e081130e-f14c-489c-9e4e-faab3dbdee6c\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.953440 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-f9b75" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.956979 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l782n\" (UniqueName: \"kubernetes.io/projected/b6222e18-ffef-4dc6-b327-3b06bb91d75a-kube-api-access-l782n\") pod \"telemetry-operator-controller-manager-6578c5f884-gml69\" (UID: \"b6222e18-ffef-4dc6-b327-3b06bb91d75a\") " pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" Dec 05 00:42:34 crc kubenswrapper[4759]: I1205 00:42:34.965828 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q5wg\" (UniqueName: \"kubernetes.io/projected/617fefa5-c3f6-450e-a569-8ee3dd12f882-kube-api-access-7q5wg\") pod \"test-operator-controller-manager-5854674fcc-8m9f7\" (UID: \"617fefa5-c3f6-450e-a569-8ee3dd12f882\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.024361 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.024427 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhk6q\" (UniqueName: \"kubernetes.io/projected/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-kube-api-access-rhk6q\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.024501 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.024601 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcqjf\" (UniqueName: \"kubernetes.io/projected/d832c1ee-6d66-4cd7-87eb-dc2d34f801cc-kube-api-access-mcqjf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-85hsm\" (UID: \"d832c1ee-6d66-4cd7-87eb-dc2d34f801cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.025033 4759 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.025079 4759 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:42:35.525065909 +0000 UTC m=+1174.740726859 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "metrics-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.025333 4759 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.025366 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:42:35.525358437 +0000 UTC m=+1174.741019387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "webhook-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.052013 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcqjf\" (UniqueName: \"kubernetes.io/projected/d832c1ee-6d66-4cd7-87eb-dc2d34f801cc-kube-api-access-mcqjf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-85hsm\" (UID: \"d832c1ee-6d66-4cd7-87eb-dc2d34f801cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.055806 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhk6q\" (UniqueName: \"kubernetes.io/projected/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-kube-api-access-rhk6q\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.096230 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.124762 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.156264 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.171460 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.177329 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr"] Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.233274 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: \"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.233566 4759 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.233666 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert podName:33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9 nodeName:}" failed. No retries permitted until 2025-12-05 00:42:36.233643896 +0000 UTC m=+1175.449304926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" (UID: "33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: W1205 00:42:35.253450 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod310627fe_09af_4a51_8312_e2b3841d6634.slice/crio-42802d02d2d98878bf2af0cc6156a2cd48072987b74dbee2ac49464a8e31f56b WatchSource:0}: Error finding container 42802d02d2d98878bf2af0cc6156a2cd48072987b74dbee2ac49464a8e31f56b: Status 404 returned error can't find the container with id 42802d02d2d98878bf2af0cc6156a2cd48072987b74dbee2ac49464a8e31f56b Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.270059 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.333342 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82"] Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.341002 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj"] Dec 05 00:42:35 crc kubenswrapper[4759]: W1205 00:42:35.371865 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf33139_8ab8_400c_8a2a_bf746d11f7e7.slice/crio-0d010c267ef3a9b91e83698b135bab836141211c5c1914a39c91567418f77d75 WatchSource:0}: Error finding container 0d010c267ef3a9b91e83698b135bab836141211c5c1914a39c91567418f77d75: Status 404 returned error can't find the container with id 0d010c267ef3a9b91e83698b135bab836141211c5c1914a39c91567418f77d75 Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.446217 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" event={"ID":"310627fe-09af-4a51-8312-e2b3841d6634","Type":"ContainerStarted","Data":"42802d02d2d98878bf2af0cc6156a2cd48072987b74dbee2ac49464a8e31f56b"} Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.450426 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj" event={"ID":"5e28b15c-c39e-463a-b9a2-6f6df5addaf8","Type":"ContainerStarted","Data":"67c9e267c377297e6cf10ba67e2b448576820cec31d4ab7a50f7cbe858aa7e81"} Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.452544 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" event={"ID":"faf33139-8ab8-400c-8a2a-bf746d11f7e7","Type":"ContainerStarted","Data":"0d010c267ef3a9b91e83698b135bab836141211c5c1914a39c91567418f77d75"} Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.540712 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.540817 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.540863 4759 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.540947 4759 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.540951 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs 
podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:42:36.540929914 +0000 UTC m=+1175.756590884 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "webhook-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.541007 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:42:36.540993405 +0000 UTC m=+1175.756654355 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "metrics-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.775490 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn"] Dec 05 00:42:35 crc kubenswrapper[4759]: W1205 00:42:35.794339 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod785b512b_7fa8_4480_b042_3811f10e3659.slice/crio-3f41ac5e975580a80c04443e2d0c9d23f38e6e811accaf07d43932f4f859342f WatchSource:0}: Error finding container 3f41ac5e975580a80c04443e2d0c9d23f38e6e811accaf07d43932f4f859342f: Status 404 returned error can't find the container with id 3f41ac5e975580a80c04443e2d0c9d23f38e6e811accaf07d43932f4f859342f Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.797349 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9"] Dec 05 00:42:35 crc kubenswrapper[4759]: W1205 00:42:35.808394 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf07c10f0_5bec_4421_8ff0_2c659e42377b.slice/crio-5fd2c91136e19fd57071566e2256157e5b7e49c6891dfaa05476ce97f88c5bf9 WatchSource:0}: Error finding container 5fd2c91136e19fd57071566e2256157e5b7e49c6891dfaa05476ce97f88c5bf9: Status 404 returned error can't find the container with id 5fd2c91136e19fd57071566e2256157e5b7e49c6891dfaa05476ce97f88c5bf9 Dec 05 00:42:35 crc kubenswrapper[4759]: W1205 00:42:35.809485 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda712ae8c_434d_43f7_ab4d_b385eee4eabf.slice/crio-82a9a01421a20cecbfd0440da4bc314f121345e12fddab5466b0953a34eabb0e WatchSource:0}: Error finding container 82a9a01421a20cecbfd0440da4bc314f121345e12fddab5466b0953a34eabb0e: Status 404 returned error can't find the container with id 82a9a01421a20cecbfd0440da4bc314f121345e12fddab5466b0953a34eabb0e Dec 05 00:42:35 crc kubenswrapper[4759]: W1205 00:42:35.814774 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc0cab7_b730_4ada_994d_eb8ae2e014df.slice/crio-893b5188983cbcd2b4f7fb3826dcadf2d8a2cfa6a5436cd2098129744b6f4fd4 WatchSource:0}: Error finding container 
893b5188983cbcd2b4f7fb3826dcadf2d8a2cfa6a5436cd2098129744b6f4fd4: Status 404 returned error can't find the container with id 893b5188983cbcd2b4f7fb3826dcadf2d8a2cfa6a5436cd2098129744b6f4fd4 Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.821656 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq"] Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.833894 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p"] Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.840506 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr"] Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.844526 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs"] Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.846060 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.846289 4759 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: E1205 00:42:35.846379 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert podName:1cce914e-3baa-4146-a52c-e054ee0c1eed nodeName:}" failed. No retries permitted until 2025-12-05 00:42:37.846360906 +0000 UTC m=+1177.062021856 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert") pod "infra-operator-controller-manager-57548d458d-9jfsw" (UID: "1cce914e-3baa-4146-a52c-e054ee0c1eed") : secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.851767 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8"] Dec 05 00:42:35 crc kubenswrapper[4759]: I1205 00:42:35.857619 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4"] Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.060991 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n"] Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.073632 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-f9b75"] Dec 05 00:42:36 crc kubenswrapper[4759]: W1205 00:42:36.086721 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9894d9a4_5121_4345_ab9c_4f770f4e4bb0.slice/crio-86f84146e2c0742868eada17c3e8c862fc08ccb9d3fb62e0e82e760a80291507 WatchSource:0}: Error finding container 86f84146e2c0742868eada17c3e8c862fc08ccb9d3fb62e0e82e760a80291507: Status 404 returned error can't find the container with id 86f84146e2c0742868eada17c3e8c862fc08ccb9d3fb62e0e82e760a80291507 Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.106385 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w"] Dec 05 00:42:36 crc kubenswrapper[4759]: W1205 00:42:36.111208 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dcdddec_138e_46fd_ab1d_15e4c4a06a15.slice/crio-17b8f878180b7f21baf4860701d7d4add196379378e447f09bc032037ad031c2 WatchSource:0}: Error finding container 17b8f878180b7f21baf4860701d7d4add196379378e447f09bc032037ad031c2: Status 404 returned error can't find the container with id 17b8f878180b7f21baf4860701d7d4add196379378e447f09bc032037ad031c2 Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.114104 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r"] Dec 05 00:42:36 crc kubenswrapper[4759]: W1205 00:42:36.120692 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bfe4b5d_2f8e_4378_aee7_be0c44e2e3da.slice/crio-9df0e8461cc79886a256203a038fd812d7a9781f06951f0283f8fe6aa59cb9b6 WatchSource:0}: Error finding container 9df0e8461cc79886a256203a038fd812d7a9781f06951f0283f8fe6aa59cb9b6: Status 404 returned error can't find the container with id 9df0e8461cc79886a256203a038fd812d7a9781f06951f0283f8fe6aa59cb9b6 Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.131156 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2w25f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-j587r_openstack-operators(0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.144673 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2w25f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-j587r_openstack-operators(0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.146426 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" podUID="0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da" Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.161032 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v"] Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.172128 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8bzd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-qxf6v_openstack-operators(da012733-6903-4607-9be5-17c81d20ae6b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.174412 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8bzd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-qxf6v_openstack-operators(da012733-6903-4607-9be5-17c81d20ae6b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.176404 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" podUID="da012733-6903-4607-9be5-17c81d20ae6b" Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.253928 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: \"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.254077 4759 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.254134 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert podName:33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9 nodeName:}" failed. No retries permitted until 2025-12-05 00:42:38.254117558 +0000 UTC m=+1177.469778508 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" (UID: "33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.331085 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7"] Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.339515 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69"] Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.352623 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm"] Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.352992 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg"] Dec 05 00:42:36 crc kubenswrapper[4759]: W1205 00:42:36.360599 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6222e18_ffef_4dc6_b327_3b06bb91d75a.slice/crio-436a8a4b50bcecca6f04a593e93b25bb39cc93c2259084ff3b612b5140a9d943 WatchSource:0}: Error finding container 436a8a4b50bcecca6f04a593e93b25bb39cc93c2259084ff3b612b5140a9d943: Status 404 returned error can't find the container with id 436a8a4b50bcecca6f04a593e93b25bb39cc93c2259084ff3b612b5140a9d943 Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.365246 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.129.56.107:5001/openstack-k8s-operators/telemetry-operator:d41273755bc130d021645570cb35db3b5f04d199,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l782n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6578c5f884-gml69_openstack-operators(b6222e18-ffef-4dc6-b327-3b06bb91d75a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.377652 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l782n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6578c5f884-gml69_openstack-operators(b6222e18-ffef-4dc6-b327-3b06bb91d75a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.381012 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: 
\"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" podUID="b6222e18-ffef-4dc6-b327-3b06bb91d75a" Dec 05 00:42:36 crc kubenswrapper[4759]: W1205 00:42:36.396487 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd832c1ee_6d66_4cd7_87eb_dc2d34f801cc.slice/crio-257e5a4d72f27e9067598fd3662d8039b190fb9395972700fecb69a9747414c5 WatchSource:0}: Error finding container 257e5a4d72f27e9067598fd3662d8039b190fb9395972700fecb69a9747414c5: Status 404 returned error can't find the container with id 257e5a4d72f27e9067598fd3662d8039b190fb9395972700fecb69a9747414c5 Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.402421 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mcqjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-85hsm_openstack-operators(d832c1ee-6d66-4cd7-87eb-dc2d34f801cc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.402732 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s2rj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-w42fg_openstack-operators(e081130e-f14c-489c-9e4e-faab3dbdee6c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.403593 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" podUID="d832c1ee-6d66-4cd7-87eb-dc2d34f801cc" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.406385 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s2rj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-w42fg_openstack-operators(e081130e-f14c-489c-9e4e-faab3dbdee6c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.407651 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" podUID="e081130e-f14c-489c-9e4e-faab3dbdee6c" Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.461110 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" event={"ID":"d832c1ee-6d66-4cd7-87eb-dc2d34f801cc","Type":"ContainerStarted","Data":"257e5a4d72f27e9067598fd3662d8039b190fb9395972700fecb69a9747414c5"} Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.462253 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" podUID="d832c1ee-6d66-4cd7-87eb-dc2d34f801cc" Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.463125 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" event={"ID":"a712ae8c-434d-43f7-ab4d-b385eee4eabf","Type":"ContainerStarted","Data":"82a9a01421a20cecbfd0440da4bc314f121345e12fddab5466b0953a34eabb0e"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.464190 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" event={"ID":"f0212865-8c85-4b7c-855c-baa0fc705bf8","Type":"ContainerStarted","Data":"a4776378b5229a6e5661ba8cece4df7a0a8745b3eeb28f90899b9301cde83867"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.466038 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" event={"ID":"e081130e-f14c-489c-9e4e-faab3dbdee6c","Type":"ContainerStarted","Data":"93a113979ab81e863c779e351b04284dea20121901e21ad229df9f0d8699cbc4"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.467411 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" 
event={"ID":"1626dead-b9fd-4fae-af93-e2332112626f","Type":"ContainerStarted","Data":"47b98daa52d552114edf0cda4960ca22dfcc0ddc331a6f20f672da2109124268"} Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.467927 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" podUID="e081130e-f14c-489c-9e4e-faab3dbdee6c" Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.469952 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" event={"ID":"f07c10f0-5bec-4421-8ff0-2c659e42377b","Type":"ContainerStarted","Data":"5fd2c91136e19fd57071566e2256157e5b7e49c6891dfaa05476ce97f88c5bf9"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.471588 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" event={"ID":"617fefa5-c3f6-450e-a569-8ee3dd12f882","Type":"ContainerStarted","Data":"fdd4846c16977ead5798054d26ea2b86b1a0bee783efe2a8c674ac753430cde9"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.474298 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" event={"ID":"cbc0cab7-b730-4ada-994d-eb8ae2e014df","Type":"ContainerStarted","Data":"893b5188983cbcd2b4f7fb3826dcadf2d8a2cfa6a5436cd2098129744b6f4fd4"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.476382 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" event={"ID":"0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da","Type":"ContainerStarted","Data":"9df0e8461cc79886a256203a038fd812d7a9781f06951f0283f8fe6aa59cb9b6"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.478796 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" event={"ID":"da012733-6903-4607-9be5-17c81d20ae6b","Type":"ContainerStarted","Data":"1af130376abaf5dbf6c42adde5a882678ff00d8aaf36ff02c7124c5a007de0ef"} Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.479089 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" podUID="0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da" Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.482139 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" event={"ID":"b6222e18-ffef-4dc6-b327-3b06bb91d75a","Type":"ContainerStarted","Data":"436a8a4b50bcecca6f04a593e93b25bb39cc93c2259084ff3b612b5140a9d943"} Dec 05 00:42:36 crc 
kubenswrapper[4759]: E1205 00:42:36.483981 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" podUID="da012733-6903-4607-9be5-17c81d20ae6b" Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.485187 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" event={"ID":"2dcdddec-138e-46fd-ab1d-15e4c4a06a15","Type":"ContainerStarted","Data":"17b8f878180b7f21baf4860701d7d4add196379378e447f09bc032037ad031c2"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.512396 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" event={"ID":"7377061e-a243-49b5-9728-4aaa2462445e","Type":"ContainerStarted","Data":"ad535d07ff7350e6bff547e7daa4c9cebdcdfca3187d3aa70e864a4d734a8265"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.518821 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-f9b75" event={"ID":"f35362c5-4886-42dc-a633-c018e7f6aaf2","Type":"ContainerStarted","Data":"efc425b3b5c434157a7547ac785f4a7217198008782642d5e8b65bed1270306f"} Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.521216 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.107:5001/openstack-k8s-operators/telemetry-operator:d41273755bc130d021645570cb35db3b5f04d199\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" podUID="b6222e18-ffef-4dc6-b327-3b06bb91d75a" Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.533113 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" event={"ID":"9894d9a4-5121-4345-ab9c-4f770f4e4bb0","Type":"ContainerStarted","Data":"86f84146e2c0742868eada17c3e8c862fc08ccb9d3fb62e0e82e760a80291507"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.539481 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq" event={"ID":"785b512b-7fa8-4480-b042-3811f10e3659","Type":"ContainerStarted","Data":"3f41ac5e975580a80c04443e2d0c9d23f38e6e811accaf07d43932f4f859342f"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.541031 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" event={"ID":"976da4b0-9b83-4ffe-9cf2-a07c3e149e04","Type":"ContainerStarted","Data":"bff93e879dc4ad7e2a0d9e0d0de1db1143cfd763a12653d50aac95f0c3be9825"} Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.558916 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:36 crc kubenswrapper[4759]: I1205 00:42:36.559051 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.559230 4759 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.559279 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:42:38.559265183 +0000 UTC m=+1177.774926133 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "webhook-server-cert" not found Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.559440 4759 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 00:42:36 crc kubenswrapper[4759]: E1205 00:42:36.559519 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:42:38.559501699 +0000 UTC m=+1177.775162649 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "metrics-server-cert" not found Dec 05 00:42:37 crc kubenswrapper[4759]: E1205 00:42:37.555160 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" podUID="e081130e-f14c-489c-9e4e-faab3dbdee6c" Dec 05 00:42:37 crc kubenswrapper[4759]: E1205 00:42:37.555226 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" podUID="d832c1ee-6d66-4cd7-87eb-dc2d34f801cc" Dec 05 00:42:37 crc kubenswrapper[4759]: E1205 00:42:37.559384 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" podUID="0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da" Dec 05 00:42:37 crc kubenswrapper[4759]: E1205 00:42:37.560259 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" podUID="da012733-6903-4607-9be5-17c81d20ae6b" Dec 05 00:42:37 crc kubenswrapper[4759]: E1205 00:42:37.560623 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.107:5001/openstack-k8s-operators/telemetry-operator:d41273755bc130d021645570cb35db3b5f04d199\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" podUID="b6222e18-ffef-4dc6-b327-3b06bb91d75a" Dec 05 00:42:37 crc kubenswrapper[4759]: I1205 00:42:37.909556 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:42:37 crc kubenswrapper[4759]: E1205 00:42:37.909687 4759 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:37 crc kubenswrapper[4759]: E1205 00:42:37.909819 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert podName:1cce914e-3baa-4146-a52c-e054ee0c1eed nodeName:}" failed. No retries permitted until 2025-12-05 00:42:41.909804754 +0000 UTC m=+1181.125465704 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert") pod "infra-operator-controller-manager-57548d458d-9jfsw" (UID: "1cce914e-3baa-4146-a52c-e054ee0c1eed") : secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:38 crc kubenswrapper[4759]: I1205 00:42:38.316871 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: \"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:42:38 crc kubenswrapper[4759]: E1205 00:42:38.317082 4759 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:38 crc kubenswrapper[4759]: E1205 00:42:38.317336 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert podName:33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9 nodeName:}" failed. No retries permitted until 2025-12-05 00:42:42.317316981 +0000 UTC m=+1181.532977931 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" (UID: "33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:38 crc kubenswrapper[4759]: I1205 00:42:38.623738 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:38 crc kubenswrapper[4759]: I1205 00:42:38.623878 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:38 crc kubenswrapper[4759]: E1205 00:42:38.624276 4759 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 00:42:38 crc kubenswrapper[4759]: E1205 00:42:38.624414 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:42:42.624339432 +0000 UTC m=+1181.840000382 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "metrics-server-cert" not found Dec 05 00:42:38 crc kubenswrapper[4759]: E1205 00:42:38.624479 4759 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 00:42:38 crc kubenswrapper[4759]: E1205 00:42:38.624515 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:42:42.624508476 +0000 UTC m=+1181.840169426 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "webhook-server-cert" not found Dec 05 00:42:41 crc kubenswrapper[4759]: I1205 00:42:41.981284 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:42:42 crc kubenswrapper[4759]: E1205 00:42:41.981446 4759 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:42 crc kubenswrapper[4759]: E1205 00:42:41.981783 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert podName:1cce914e-3baa-4146-a52c-e054ee0c1eed nodeName:}" failed. No retries permitted until 2025-12-05 00:42:49.981763918 +0000 UTC m=+1189.197424868 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert") pod "infra-operator-controller-manager-57548d458d-9jfsw" (UID: "1cce914e-3baa-4146-a52c-e054ee0c1eed") : secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:42 crc kubenswrapper[4759]: I1205 00:42:42.388061 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: \"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:42:42 crc kubenswrapper[4759]: E1205 00:42:42.388251 4759 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:42 crc kubenswrapper[4759]: E1205 00:42:42.388435 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert podName:33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9 nodeName:}" failed. No retries permitted until 2025-12-05 00:42:50.388415152 +0000 UTC m=+1189.604076102 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" (UID: "33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:42 crc kubenswrapper[4759]: I1205 00:42:42.694145 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:42 crc kubenswrapper[4759]: I1205 00:42:42.694444 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:42 crc kubenswrapper[4759]: E1205 00:42:42.694455 4759 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 00:42:42 crc kubenswrapper[4759]: E1205 00:42:42.694527 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:42:50.694509701 +0000 UTC m=+1189.910170651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "webhook-server-cert" not found Dec 05 00:42:42 crc kubenswrapper[4759]: E1205 00:42:42.694614 4759 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 00:42:42 crc kubenswrapper[4759]: E1205 00:42:42.694665 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:42:50.694650204 +0000 UTC m=+1189.910311154 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "metrics-server-cert" not found Dec 05 00:42:48 crc kubenswrapper[4759]: E1205 00:42:48.598742 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 05 00:42:48 crc kubenswrapper[4759]: E1205 00:42:48.599420 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qbx7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-6gh82_openstack-operators(faf33139-8ab8-400c-8a2a-bf746d11f7e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:42:49 crc kubenswrapper[4759]: E1205 00:42:49.229988 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809" Dec 05 00:42:49 crc 
kubenswrapper[4759]: E1205 00:42:49.230757 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j7wvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-s6jzr_openstack-operators(7377061e-a243-49b5-9728-4aaa2462445e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:42:49 crc kubenswrapper[4759]: E1205 00:42:49.814150 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 05 00:42:49 crc kubenswrapper[4759]: E1205 00:42:49.814378 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2jpld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-dcs2p_openstack-operators(976da4b0-9b83-4ffe-9cf2-a07c3e149e04): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:42:50 crc kubenswrapper[4759]: I1205 00:42:50.014374 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.014567 4759 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.014639 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert podName:1cce914e-3baa-4146-a52c-e054ee0c1eed nodeName:}" failed. No retries permitted until 2025-12-05 00:43:06.01461898 +0000 UTC m=+1205.230279930 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert") pod "infra-operator-controller-manager-57548d458d-9jfsw" (UID: "1cce914e-3baa-4146-a52c-e054ee0c1eed") : secret "infra-operator-webhook-server-cert" not found Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.391077 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.391665 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t47jd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-p5g9w_openstack-operators(2dcdddec-138e-46fd-ab1d-15e4c4a06a15): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:42:50 crc kubenswrapper[4759]: I1205 00:42:50.420767 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: 
\"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.420904 4759 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.420990 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert podName:33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9 nodeName:}" failed. No retries permitted until 2025-12-05 00:43:06.420970367 +0000 UTC m=+1205.636631317 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" (UID: "33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 00:42:50 crc kubenswrapper[4759]: I1205 00:42:50.726047 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:50 crc kubenswrapper[4759]: I1205 00:42:50.726178 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.726269 4759 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.726367 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:43:06.726348108 +0000 UTC m=+1205.942009058 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "metrics-server-cert" not found Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.726369 4759 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.726427 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs podName:4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba nodeName:}" failed. No retries permitted until 2025-12-05 00:43:06.72641185 +0000 UTC m=+1205.942072800 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs") pod "openstack-operator-controller-manager-759bbb976c-dtqzv" (UID: "4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba") : secret "webhook-server-cert" not found Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.976739 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 05 00:42:50 crc kubenswrapper[4759]: E1205 00:42:50.976945 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b5t8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-lrxdn_openstack-operators(f0212865-8c85-4b7c-855c-baa0fc705bf8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:42:51 crc kubenswrapper[4759]: E1205 00:42:51.979511 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 05 00:42:51 crc 
kubenswrapper[4759]: E1205 00:42:51.979932 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wsn95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-dwt5n_openstack-operators(9894d9a4-5121-4345-ab9c-4f770f4e4bb0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:42:52 crc kubenswrapper[4759]: E1205 00:42:52.610424 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 05 00:42:52 crc kubenswrapper[4759]: E1205 00:42:52.610631 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krjkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-5h7t4_openstack-operators(1626dead-b9fd-4fae-af93-e2332112626f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:42:55 crc kubenswrapper[4759]: I1205 00:42:55.709042 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" event={"ID":"f07c10f0-5bec-4421-8ff0-2c659e42377b","Type":"ContainerStarted","Data":"10f4a368b035545b9ced411ed0d9f59c6527db00680ae13c7c8ac33b43e442fb"} Dec 05 00:42:55 crc kubenswrapper[4759]: I1205 00:42:55.712041 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" event={"ID":"310627fe-09af-4a51-8312-e2b3841d6634","Type":"ContainerStarted","Data":"c9536de216dab0b57764c51d30bb930efad7365ad7b312932dea69e7f437a0f3"} Dec 05 00:42:56 crc kubenswrapper[4759]: I1205 00:42:56.725212 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" event={"ID":"617fefa5-c3f6-450e-a569-8ee3dd12f882","Type":"ContainerStarted","Data":"0aff1b6b92d17d6cd03b40d5f8d086891877b526c8bdba0f6d72d91da9ba55f0"} Dec 05 00:42:56 crc kubenswrapper[4759]: I1205 00:42:56.735975 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" 
event={"ID":"cbc0cab7-b730-4ada-994d-eb8ae2e014df","Type":"ContainerStarted","Data":"0aba06d65a6113dcb08a87b6b6c903beea397500213b77c92bc5389eafe50b4e"} Dec 05 00:42:56 crc kubenswrapper[4759]: I1205 00:42:56.738479 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" event={"ID":"a712ae8c-434d-43f7-ab4d-b385eee4eabf","Type":"ContainerStarted","Data":"befcb25c6a95f6ef22ff90369d1f9f841a408ab19512d7f046e72ef38028df4c"} Dec 05 00:42:56 crc kubenswrapper[4759]: I1205 00:42:56.757776 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq" event={"ID":"785b512b-7fa8-4480-b042-3811f10e3659","Type":"ContainerStarted","Data":"53a9b7d3966de7ccb28bda34521bb0eef688d1b240dca4dfe157ece6472cc8e0"} Dec 05 00:42:57 crc kubenswrapper[4759]: I1205 00:42:57.775648 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj" event={"ID":"5e28b15c-c39e-463a-b9a2-6f6df5addaf8","Type":"ContainerStarted","Data":"bdcdf414b1b36c9cf01b666c4f0c557c854f40980c994dceb15c581272f7b87d"} Dec 05 00:42:57 crc kubenswrapper[4759]: I1205 00:42:57.777446 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-f9b75" event={"ID":"f35362c5-4886-42dc-a633-c018e7f6aaf2","Type":"ContainerStarted","Data":"4b60cb43bfb87b80dda299551b06dcc09b32fbe3bf28dc3749416d9a78a8215d"} Dec 05 00:42:57 crc kubenswrapper[4759]: I1205 00:42:57.780922 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" event={"ID":"e081130e-f14c-489c-9e4e-faab3dbdee6c","Type":"ContainerStarted","Data":"c1f6fb88a9d23de2435c2ae9812811f30cec55e22801387df3a39b0871bc42a5"} Dec 05 00:42:59 crc kubenswrapper[4759]: I1205 00:42:59.814225 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" event={"ID":"da012733-6903-4607-9be5-17c81d20ae6b","Type":"ContainerStarted","Data":"f1ca0bb4be179f418d1b2740ba2a539b470df237b13ae9d0574c916b58fad750"} Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.053522 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.063586 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cce914e-3baa-4146-a52c-e054ee0c1eed-cert\") pod \"infra-operator-controller-manager-57548d458d-9jfsw\" (UID: \"1cce914e-3baa-4146-a52c-e054ee0c1eed\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.159474 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5j9x9" Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.167521 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.465521 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: \"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.477851 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g\" (UID: \"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.668044 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bl7ft" Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.676649 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.769544 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.769672 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.775793 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-metrics-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:43:06 crc kubenswrapper[4759]: I1205 00:43:06.776583 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba-webhook-certs\") pod \"openstack-operator-controller-manager-759bbb976c-dtqzv\" (UID: \"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba\") " pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:43:07 crc kubenswrapper[4759]: I1205 00:43:07.002927 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-pdfnv" Dec 05 00:43:07 crc kubenswrapper[4759]: I1205 00:43:07.011172 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:43:12 crc kubenswrapper[4759]: E1205 00:43:12.551814 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:12 crc kubenswrapper[4759]: E1205 00:43:12.552754 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b5t8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-lrxdn_openstack-operators(f0212865-8c85-4b7c-855c-baa0fc705bf8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:43:12 crc kubenswrapper[4759]: E1205 00:43:12.553954 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" podUID="f0212865-8c85-4b7c-855c-baa0fc705bf8" Dec 05 00:43:12 crc kubenswrapper[4759]: E1205 00:43:12.575600 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:12 crc kubenswrapper[4759]: E1205 00:43:12.575754 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qbx7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-6gh82_openstack-operators(faf33139-8ab8-400c-8a2a-bf746d11f7e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:43:12 crc kubenswrapper[4759]: E1205 00:43:12.576972 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" podUID="faf33139-8ab8-400c-8a2a-bf746d11f7e7" Dec 05 00:43:12 crc kubenswrapper[4759]: E1205 00:43:12.596051 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:12 crc kubenswrapper[4759]: E1205 00:43:12.596210 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2jpld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
nova-operator-controller-manager-697bc559fc-dcs2p_openstack-operators(976da4b0-9b83-4ffe-9cf2-a07c3e149e04): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:43:12 crc kubenswrapper[4759]: E1205 00:43:12.597606 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" podUID="976da4b0-9b83-4ffe-9cf2-a07c3e149e04" Dec 05 00:43:12 crc kubenswrapper[4759]: I1205 00:43:12.955908 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" event={"ID":"b6222e18-ffef-4dc6-b327-3b06bb91d75a","Type":"ContainerStarted","Data":"f0191404c7e9b95593500f931b507eaa3db99626c49e1bfae6d0ae616ca5c432"} Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.197439 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.197661 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j7wvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-s6jzr_openstack-operators(7377061e-a243-49b5-9728-4aaa2462445e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.198936 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" 
pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" podUID="7377061e-a243-49b5-9728-4aaa2462445e" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.214667 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.214878 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krjkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-5h7t4_openstack-operators(1626dead-b9fd-4fae-af93-e2332112626f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.216199 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" podUID="1626dead-b9fd-4fae-af93-e2332112626f" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.217551 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.217682 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 
-3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wsn95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-dwt5n_openstack-operators(9894d9a4-5121-4345-ab9c-4f770f4e4bb0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.219099 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" podUID="9894d9a4-5121-4345-ab9c-4f770f4e4bb0" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.640185 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.640559 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w9jxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
cinder-operator-controller-manager-859b6ccc6-zczgr_openstack-operators(310627fe-09af-4a51-8312-e2b3841d6634): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.641887 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" podUID="310627fe-09af-4a51-8312-e2b3841d6634" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.645612 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.645705 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mcqjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-85hsm_openstack-operators(d832c1ee-6d66-4cd7-87eb-dc2d34f801cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.646855 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" podUID="d832c1ee-6d66-4cd7-87eb-dc2d34f801cc" Dec 05 
00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.671450 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.671627 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t47jd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-p5g9w_openstack-operators(2dcdddec-138e-46fd-ab1d-15e4c4a06a15): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.672835 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" podUID="2dcdddec-138e-46fd-ab1d-15e4c4a06a15" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.721725 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.722037 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cb7ww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-jrlg9_openstack-operators(a712ae8c-434d-43f7-ab4d-b385eee4eabf): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.724093 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" podUID="a712ae8c-434d-43f7-ab4d-b385eee4eabf" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.730896 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.731028 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vrrcc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-7s2r8_openstack-operators(f07c10f0-5bec-4421-8ff0-2c659e42377b): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 00:43:14 crc kubenswrapper[4759]: 
E1205 00:43:14.732205 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" podUID="f07c10f0-5bec-4421-8ff0-2c659e42377b" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.734566 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.734691 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8hq2p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-ckxbs_openstack-operators(cbc0cab7-b730-4ada-994d-eb8ae2e014df): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.735921 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" podUID="cbc0cab7-b730-4ada-994d-eb8ae2e014df" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.753097 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.753281 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 
500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7q5wg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-8m9f7_openstack-operators(617fefa5-c3f6-450e-a569-8ee3dd12f882): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 05 00:43:14 crc kubenswrapper[4759]: E1205 00:43:14.754502 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" podUID="617fefa5-c3f6-450e-a569-8ee3dd12f882" Dec 05 00:43:14 crc kubenswrapper[4759]: I1205 00:43:14.978887 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" event={"ID":"0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da","Type":"ContainerStarted","Data":"2284796bfd33618fe63463170a0b85e7074110eae83cb916bb7e40467aa6123d"} Dec 05 00:43:14 crc kubenswrapper[4759]: I1205 00:43:14.979214 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" Dec 05 00:43:14 crc kubenswrapper[4759]: I1205 00:43:14.980606 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" Dec 05 00:43:14 crc kubenswrapper[4759]: I1205 00:43:14.980762 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" Dec 05 00:43:14 crc kubenswrapper[4759]: I1205 00:43:14.980813 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" Dec 05 00:43:14 crc kubenswrapper[4759]: I1205 00:43:14.988191 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" Dec 05 00:43:14 crc kubenswrapper[4759]: I1205 00:43:14.988492 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" Dec 05 00:43:14 crc kubenswrapper[4759]: I1205 00:43:14.988563 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" Dec 05 00:43:14 crc kubenswrapper[4759]: I1205 00:43:14.990096 4759 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" Dec 05 00:43:15 crc kubenswrapper[4759]: I1205 00:43:15.279376 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv"] Dec 05 00:43:15 crc kubenswrapper[4759]: I1205 00:43:15.307298 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g"] Dec 05 00:43:15 crc kubenswrapper[4759]: I1205 00:43:15.316519 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw"] Dec 05 00:43:15 crc kubenswrapper[4759]: W1205 00:43:15.338544 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f594b2_42dc_4c4e_95e4_ebdfe6c96ee9.slice/crio-f69e7185a49e2c6ad1f0651935a28231ff663b35e8e4838afe29cd4195f22e8d WatchSource:0}: Error finding container f69e7185a49e2c6ad1f0651935a28231ff663b35e8e4838afe29cd4195f22e8d: Status 404 returned error can't find the container with id f69e7185a49e2c6ad1f0651935a28231ff663b35e8e4838afe29cd4195f22e8d Dec 05 00:43:15 crc kubenswrapper[4759]: I1205 00:43:15.985654 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" event={"ID":"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba","Type":"ContainerStarted","Data":"1e4b8d67bd85fd321eb19fead41a227a6bbb1e36a1504f11c267f1085530a2b3"} Dec 05 00:43:15 crc kubenswrapper[4759]: I1205 00:43:15.991148 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" event={"ID":"976da4b0-9b83-4ffe-9cf2-a07c3e149e04","Type":"ContainerStarted","Data":"c0895a2e21be18cf6fdcb50e368457e38ac03b755c3de43f0f3a0b39664f7f73"} Dec 05 00:43:15 crc kubenswrapper[4759]: I1205 00:43:15.992814 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" event={"ID":"b6222e18-ffef-4dc6-b327-3b06bb91d75a","Type":"ContainerStarted","Data":"87d68e083e7d3a6c7fe03798645078dbf9d1997b7512e2886c3ea805481ce521"} Dec 05 00:43:15 crc kubenswrapper[4759]: I1205 00:43:15.992933 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" Dec 05 00:43:15 crc kubenswrapper[4759]: I1205 00:43:15.994640 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj" event={"ID":"5e28b15c-c39e-463a-b9a2-6f6df5addaf8","Type":"ContainerStarted","Data":"8029efbfb7b44e802ce9b6ea5386d6ab30e89c7cddb7517a95a7173b729345a8"} Dec 05 00:43:15 crc kubenswrapper[4759]: I1205 00:43:15.996445 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq" event={"ID":"785b512b-7fa8-4480-b042-3811f10e3659","Type":"ContainerStarted","Data":"f9e8307537ef4d867b4e5c30331b48a0e2f7465192c7755e14e088e3a7affeba"} Dec 05 00:43:15 crc kubenswrapper[4759]: I1205 00:43:15.998191 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" 
event={"ID":"0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da","Type":"ContainerStarted","Data":"1437a4a05ec3f328c76e1fbcce4bd809d35f16ad8916792ed2d908811d75ac48"} Dec 05 00:43:15 crc kubenswrapper[4759]: I1205 00:43:15.998330 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:15.999739 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" event={"ID":"e081130e-f14c-489c-9e4e-faab3dbdee6c","Type":"ContainerStarted","Data":"0ad6e845a59d4adcfdd9d62d01f8525078323cbc3c87993e6d6854a11892a19d"} Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.000510 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.002213 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.005503 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" event={"ID":"faf33139-8ab8-400c-8a2a-bf746d11f7e7","Type":"ContainerStarted","Data":"42006b6aa864157053f8d26783f59d6e8b85f5d625cc589867b400d65cb67bf5"} Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.007873 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" event={"ID":"617fefa5-c3f6-450e-a569-8ee3dd12f882","Type":"ContainerStarted","Data":"32d707e440c60f0b579062be22253b6480d33bf49fc9225a7b9d315bc07f88f2"} Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.014379 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" event={"ID":"a712ae8c-434d-43f7-ab4d-b385eee4eabf","Type":"ContainerStarted","Data":"d617406326f88b569d9c31ee3fcf14e5d280fe79bf5dc454deb1965ae3d02de0"} Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.017381 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" event={"ID":"cbc0cab7-b730-4ada-994d-eb8ae2e014df","Type":"ContainerStarted","Data":"cf7a83e10b754cc7be269be1eb7c42bc0b504aa9cba284d4c10388e1e55848cc"} Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.017626 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" podStartSLOduration=3.4251930489999998 podStartE2EDuration="42.017615717s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:36.36504788 +0000 UTC m=+1175.580708830" lastFinishedPulling="2025-12-05 00:43:14.957470538 +0000 UTC m=+1214.173131498" observedRunningTime="2025-12-05 00:43:16.016209152 +0000 UTC m=+1215.231870102" watchObservedRunningTime="2025-12-05 00:43:16.017615717 +0000 UTC m=+1215.233276667" Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.020721 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" event={"ID":"f0212865-8c85-4b7c-855c-baa0fc705bf8","Type":"ContainerStarted","Data":"84195418e03497b27890b3051d7bc25efa1f1d9882691f62a743f461c85abf39"} Dec 05 
00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.021564 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" event={"ID":"1cce914e-3baa-4146-a52c-e054ee0c1eed","Type":"ContainerStarted","Data":"9908238b07a0974fcc54ca7795e011673a4a8b869c36a1c91af875a54af08f18"} Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.022269 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" event={"ID":"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9","Type":"ContainerStarted","Data":"f69e7185a49e2c6ad1f0651935a28231ff663b35e8e4838afe29cd4195f22e8d"} Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.029223 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-f9b75" event={"ID":"f35362c5-4886-42dc-a633-c018e7f6aaf2","Type":"ContainerStarted","Data":"ca51577d8f3405b6a88c61e25aae1e01f212ff650ef01a3511e3cd2dbd6b3c94"} Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.030440 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-f9b75" Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.034705 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-f9b75" Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.077326 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jrlg9" podStartSLOduration=25.035363821 podStartE2EDuration="43.077298396s" podCreationTimestamp="2025-12-05 00:42:33 +0000 UTC" firstStartedPulling="2025-12-05 00:42:35.815014174 +0000 UTC m=+1175.030675134" lastFinishedPulling="2025-12-05 00:42:53.856948759 +0000 UTC m=+1193.072609709" observedRunningTime="2025-12-05 00:43:16.047199155 +0000 UTC m=+1215.262860125" watchObservedRunningTime="2025-12-05 00:43:16.077298396 +0000 UTC m=+1215.292959346" Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.097825 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8m9f7" podStartSLOduration=24.473564974 podStartE2EDuration="42.097809142s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:36.357775901 +0000 UTC m=+1175.573436851" lastFinishedPulling="2025-12-05 00:42:53.982020069 +0000 UTC m=+1193.197681019" observedRunningTime="2025-12-05 00:43:16.088745709 +0000 UTC m=+1215.304406659" watchObservedRunningTime="2025-12-05 00:43:16.097809142 +0000 UTC m=+1215.313470092" Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.127752 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-w42fg" podStartSLOduration=3.599633414 podStartE2EDuration="42.127736308s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:36.402665606 +0000 UTC m=+1175.618326556" lastFinishedPulling="2025-12-05 00:43:14.9307685 +0000 UTC m=+1214.146429450" observedRunningTime="2025-12-05 00:43:16.12331661 +0000 UTC m=+1215.338977560" watchObservedRunningTime="2025-12-05 00:43:16.127736308 +0000 UTC m=+1215.343397258" Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.165792 4759 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" podStartSLOduration=21.56649515 podStartE2EDuration="42.165772936s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:36.131002416 +0000 UTC m=+1175.346663366" lastFinishedPulling="2025-12-05 00:42:56.730280202 +0000 UTC m=+1195.945941152" observedRunningTime="2025-12-05 00:43:16.164928725 +0000 UTC m=+1215.380589675" watchObservedRunningTime="2025-12-05 00:43:16.165772936 +0000 UTC m=+1215.381433886" Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.210541 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-f9b75" podStartSLOduration=3.40043363 podStartE2EDuration="42.210521008s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:36.1197889 +0000 UTC m=+1175.335449850" lastFinishedPulling="2025-12-05 00:43:14.929876278 +0000 UTC m=+1214.145537228" observedRunningTime="2025-12-05 00:43:16.186172838 +0000 UTC m=+1215.401833778" watchObservedRunningTime="2025-12-05 00:43:16.210521008 +0000 UTC m=+1215.426181958" Dec 05 00:43:16 crc kubenswrapper[4759]: I1205 00:43:16.215411 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ckxbs" podStartSLOduration=24.180865554 podStartE2EDuration="42.215401238s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:35.822876057 +0000 UTC m=+1175.038537007" lastFinishedPulling="2025-12-05 00:42:53.857411741 +0000 UTC m=+1193.073072691" observedRunningTime="2025-12-05 00:43:16.205159146 +0000 UTC m=+1215.420820096" watchObservedRunningTime="2025-12-05 00:43:16.215401238 +0000 UTC m=+1215.431062188" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.038138 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" event={"ID":"faf33139-8ab8-400c-8a2a-bf746d11f7e7","Type":"ContainerStarted","Data":"26e4bfb8ca8ad114fa780c33653bf6f26e6e22a8f952dbaa4d978d55530819b9"} Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.038636 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.044753 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" event={"ID":"da012733-6903-4607-9be5-17c81d20ae6b","Type":"ContainerStarted","Data":"1fc46b86c15a47677915aa01b8c11b5a0748f7e3880bc957e38c7b06c3e84274"} Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.045938 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.055087 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.057068 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" 
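
Note: each pod_startup_latency_tracker entry can be recomputed from its own fields: podStartE2EDuration = watchObservedRunningTime - podCreationTimestamp, and podStartSLOduration = podStartE2EDuration - (lastFinishedPulling - firstStartedPulling), i.e. the startup SLI excludes time spent pulling images. When both pull timestamps are the zero time ("0001-01-01 00:00:00 +0000 UTC", as in the openstack-operator-controller-manager entry further below), no pull was recorded and the two durations coincide. A check in Go against the ovn-operator entry above, timestamps copied verbatim (the layout is the one Go's Time.String() emits, which is what these log fields contain):

    package main

    import (
        "fmt"
        "time"
    )

    const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // round-trips Time.String()

    func mustParse(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2025-12-05 00:42:34 +0000 UTC")
        firstPull := mustParse("2025-12-05 00:42:36.131002416 +0000 UTC")
        lastPull := mustParse("2025-12-05 00:42:56.730280202 +0000 UTC")
        running := mustParse("2025-12-05 00:43:16.165772936 +0000 UTC")

        e2e := running.Sub(created)          // 42.165772936s = podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // 21.56649515s  = podStartSLOduration
        fmt.Println(e2e, slo)                // matches the logged values exactly
    }
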
event={"ID":"f07c10f0-5bec-4421-8ff0-2c659e42377b","Type":"ContainerStarted","Data":"26a272236ccf3d6bdc53b18daf5b6baf4673235a029fe387c9948836b57a7d25"} Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.068228 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" event={"ID":"310627fe-09af-4a51-8312-e2b3841d6634","Type":"ContainerStarted","Data":"73837c14098903d35cdbba3a0a20b443793d7d33cae1bd249753dea87c8a6204"} Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.068299 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" podStartSLOduration=4.693961058 podStartE2EDuration="44.068281952s" podCreationTimestamp="2025-12-05 00:42:33 +0000 UTC" firstStartedPulling="2025-12-05 00:42:35.378806101 +0000 UTC m=+1174.594467051" lastFinishedPulling="2025-12-05 00:43:14.753126995 +0000 UTC m=+1213.968787945" observedRunningTime="2025-12-05 00:43:17.063074494 +0000 UTC m=+1216.278735444" watchObservedRunningTime="2025-12-05 00:43:17.068281952 +0000 UTC m=+1216.283942902" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.069221 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.073619 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.088680 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-qxf6v" podStartSLOduration=4.309136779 podStartE2EDuration="43.088664455s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:36.171954554 +0000 UTC m=+1175.387615504" lastFinishedPulling="2025-12-05 00:43:14.95148223 +0000 UTC m=+1214.167143180" observedRunningTime="2025-12-05 00:43:17.086530522 +0000 UTC m=+1216.302191472" watchObservedRunningTime="2025-12-05 00:43:17.088664455 +0000 UTC m=+1216.304325405" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.102554 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" event={"ID":"4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba","Type":"ContainerStarted","Data":"ce3be8413e082cb7fea359d30319187437c27f7e763020549a408cfb876aef90"} Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.103169 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.107193 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6578c5f884-gml69" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.111596 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.116057 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-7s2r8" podStartSLOduration=25.906586944 podStartE2EDuration="43.116039898s" podCreationTimestamp="2025-12-05 
00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:35.810501852 +0000 UTC m=+1175.026162802" lastFinishedPulling="2025-12-05 00:42:53.019954806 +0000 UTC m=+1192.235615756" observedRunningTime="2025-12-05 00:43:17.112359178 +0000 UTC m=+1216.328020118" watchObservedRunningTime="2025-12-05 00:43:17.116039898 +0000 UTC m=+1216.331700848" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.278425 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-zczgr" podStartSLOduration=26.528132886 podStartE2EDuration="44.278405077s" podCreationTimestamp="2025-12-05 00:42:33 +0000 UTC" firstStartedPulling="2025-12-05 00:42:35.269368986 +0000 UTC m=+1174.485029936" lastFinishedPulling="2025-12-05 00:42:53.019641167 +0000 UTC m=+1192.235302127" observedRunningTime="2025-12-05 00:43:17.275170068 +0000 UTC m=+1216.490831018" watchObservedRunningTime="2025-12-05 00:43:17.278405077 +0000 UTC m=+1216.494066027" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.288287 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" podStartSLOduration=43.28826896 podStartE2EDuration="43.28826896s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:43:17.206136538 +0000 UTC m=+1216.421797488" watchObservedRunningTime="2025-12-05 00:43:17.28826896 +0000 UTC m=+1216.503929910" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.315123 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-zh2wq" podStartSLOduration=4.009113479 podStartE2EDuration="43.31508287s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:35.801585303 +0000 UTC m=+1175.017246263" lastFinishedPulling="2025-12-05 00:43:15.107554704 +0000 UTC m=+1214.323215654" observedRunningTime="2025-12-05 00:43:17.299018175 +0000 UTC m=+1216.514679125" watchObservedRunningTime="2025-12-05 00:43:17.31508287 +0000 UTC m=+1216.530743820" Dec 05 00:43:17 crc kubenswrapper[4759]: I1205 00:43:17.351779 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj" podStartSLOduration=4.592499639 podStartE2EDuration="44.351740654s" podCreationTimestamp="2025-12-05 00:42:33 +0000 UTC" firstStartedPulling="2025-12-05 00:42:35.376526894 +0000 UTC m=+1174.592187844" lastFinishedPulling="2025-12-05 00:43:15.135767909 +0000 UTC m=+1214.351428859" observedRunningTime="2025-12-05 00:43:17.319329265 +0000 UTC m=+1216.534990215" watchObservedRunningTime="2025-12-05 00:43:17.351740654 +0000 UTC m=+1216.567401604" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.109895 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" event={"ID":"f0212865-8c85-4b7c-855c-baa0fc705bf8","Type":"ContainerStarted","Data":"7cf0df8255a9200ad3fedee30f0c26a04a362feb7207c2f255bcf267321168ba"} Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.110065 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.112057 4759 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" event={"ID":"9894d9a4-5121-4345-ab9c-4f770f4e4bb0","Type":"ContainerStarted","Data":"0eab6ceffdac0b3157c60b99ad8b0a3f1b8c70674f5ef1d11da021b3b0ac7b38"} Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.112093 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" event={"ID":"9894d9a4-5121-4345-ab9c-4f770f4e4bb0","Type":"ContainerStarted","Data":"02894df60c921b8f9063c1abbe7f479d6b4b13015f1a4437d94e5c2c930e681b"} Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.112381 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.114159 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" event={"ID":"2dcdddec-138e-46fd-ab1d-15e4c4a06a15","Type":"ContainerStarted","Data":"afd17971db7b57fb59e112864a0f2192268065eb1348c6285362122d9cd29b94"} Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.114189 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" event={"ID":"2dcdddec-138e-46fd-ab1d-15e4c4a06a15","Type":"ContainerStarted","Data":"dae7818fd3d7c07b0a0ea32507a0c389cabd646ebaa723ccc3be2cd1e9782aba"} Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.114348 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.115761 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" event={"ID":"1626dead-b9fd-4fae-af93-e2332112626f","Type":"ContainerStarted","Data":"9d18385206585abc82496220a4f92149f2d32fa2071276b3745dbf79716fc00f"} Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.115786 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" event={"ID":"1626dead-b9fd-4fae-af93-e2332112626f","Type":"ContainerStarted","Data":"436559b98f3dd7d663c2f926a9d88c07a772af538ceb2d255b0e1c3f81930364"} Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.116322 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.118607 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" event={"ID":"7377061e-a243-49b5-9728-4aaa2462445e","Type":"ContainerStarted","Data":"0f27eeeee40e578135d5f172db4c2d4820a00028c193ba6c5e308b59048a75a6"} Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.118631 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" event={"ID":"7377061e-a243-49b5-9728-4aaa2462445e","Type":"ContainerStarted","Data":"1d26cd7ad81fc566ca53d8b904ba9caba632ebea8d06d38ec69327423b16b3d7"} Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.118771 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.121801 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" event={"ID":"976da4b0-9b83-4ffe-9cf2-a07c3e149e04","Type":"ContainerStarted","Data":"5b055f881ce601034819c71f795217730f17ef3bc012b6de6ad333eafbe59481"} Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.122176 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.132633 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" podStartSLOduration=6.160827485 podStartE2EDuration="45.132620395s" podCreationTimestamp="2025-12-05 00:42:33 +0000 UTC" firstStartedPulling="2025-12-05 00:42:35.782212536 +0000 UTC m=+1174.997873506" lastFinishedPulling="2025-12-05 00:43:14.754005466 +0000 UTC m=+1213.969666416" observedRunningTime="2025-12-05 00:43:18.127445508 +0000 UTC m=+1217.343106488" watchObservedRunningTime="2025-12-05 00:43:18.132620395 +0000 UTC m=+1217.348281345" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.157079 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" podStartSLOduration=3.583007505 podStartE2EDuration="44.157064177s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:36.089413731 +0000 UTC m=+1175.305074681" lastFinishedPulling="2025-12-05 00:43:16.663470403 +0000 UTC m=+1215.879131353" observedRunningTime="2025-12-05 00:43:18.150744881 +0000 UTC m=+1217.366405851" watchObservedRunningTime="2025-12-05 00:43:18.157064177 +0000 UTC m=+1217.372725127" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.195963 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" podStartSLOduration=4.379927662 podStartE2EDuration="44.195942055s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:35.827428129 +0000 UTC m=+1175.043089069" lastFinishedPulling="2025-12-05 00:43:15.643442522 +0000 UTC m=+1214.859103462" observedRunningTime="2025-12-05 00:43:18.17667309 +0000 UTC m=+1217.392334040" watchObservedRunningTime="2025-12-05 00:43:18.195942055 +0000 UTC m=+1217.411603005" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.196915 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" podStartSLOduration=5.247164751 podStartE2EDuration="44.196910708s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:35.806684539 +0000 UTC m=+1175.022345489" lastFinishedPulling="2025-12-05 00:43:14.756430496 +0000 UTC m=+1213.972091446" observedRunningTime="2025-12-05 00:43:18.192361106 +0000 UTC m=+1217.408022056" watchObservedRunningTime="2025-12-05 00:43:18.196910708 +0000 UTC m=+1217.412571658" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.213093 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" podStartSLOduration=3.668942343 
podStartE2EDuration="44.213075657s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:36.1193996 +0000 UTC m=+1175.335060560" lastFinishedPulling="2025-12-05 00:43:16.663532924 +0000 UTC m=+1215.879193874" observedRunningTime="2025-12-05 00:43:18.209059768 +0000 UTC m=+1217.424720718" watchObservedRunningTime="2025-12-05 00:43:18.213075657 +0000 UTC m=+1217.428736617" Dec 05 00:43:18 crc kubenswrapper[4759]: I1205 00:43:18.226680 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" podStartSLOduration=4.375880214 podStartE2EDuration="45.226660691s" podCreationTimestamp="2025-12-05 00:42:33 +0000 UTC" firstStartedPulling="2025-12-05 00:42:35.827490981 +0000 UTC m=+1175.043151931" lastFinishedPulling="2025-12-05 00:43:16.678271458 +0000 UTC m=+1215.893932408" observedRunningTime="2025-12-05 00:43:18.224453047 +0000 UTC m=+1217.440113997" watchObservedRunningTime="2025-12-05 00:43:18.226660691 +0000 UTC m=+1217.442321641" Dec 05 00:43:19 crc kubenswrapper[4759]: I1205 00:43:19.131354 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" Dec 05 00:43:20 crc kubenswrapper[4759]: I1205 00:43:20.144770 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-dcs2p" Dec 05 00:43:21 crc kubenswrapper[4759]: I1205 00:43:21.151236 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" event={"ID":"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9","Type":"ContainerStarted","Data":"399a9428c6182b838c5ac333a6b045e959e9704b09ed09a339adf824c25ff6a5"} Dec 05 00:43:21 crc kubenswrapper[4759]: I1205 00:43:21.151322 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" event={"ID":"33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9","Type":"ContainerStarted","Data":"2df291a0d17b5c913c1ca17310415effdf6b148faee3bf7cc78888e5722b3a61"} Dec 05 00:43:21 crc kubenswrapper[4759]: I1205 00:43:21.151348 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:43:21 crc kubenswrapper[4759]: I1205 00:43:21.154263 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" event={"ID":"1cce914e-3baa-4146-a52c-e054ee0c1eed","Type":"ContainerStarted","Data":"2fb8bdd99ff6c322752ead20a04aa863154cbe269ec7d65623bc57f853d89e39"} Dec 05 00:43:21 crc kubenswrapper[4759]: I1205 00:43:21.154332 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" event={"ID":"1cce914e-3baa-4146-a52c-e054ee0c1eed","Type":"ContainerStarted","Data":"ad5f8b0e6f124ee1dfe3decf841ffcb193f9352661165e4afb46e785411081d9"} Dec 05 00:43:21 crc kubenswrapper[4759]: I1205 00:43:21.154423 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:43:21 crc kubenswrapper[4759]: I1205 00:43:21.211238 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" podStartSLOduration=42.154735383 podStartE2EDuration="47.211205974s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:43:15.34658114 +0000 UTC m=+1214.562242090" lastFinishedPulling="2025-12-05 00:43:20.403051711 +0000 UTC m=+1219.618712681" observedRunningTime="2025-12-05 00:43:21.195560669 +0000 UTC m=+1220.411221639" watchObservedRunningTime="2025-12-05 00:43:21.211205974 +0000 UTC m=+1220.426866964" Dec 05 00:43:21 crc kubenswrapper[4759]: I1205 00:43:21.250545 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" podStartSLOduration=43.198729936 podStartE2EDuration="48.250513832s" podCreationTimestamp="2025-12-05 00:42:33 +0000 UTC" firstStartedPulling="2025-12-05 00:43:15.349567824 +0000 UTC m=+1214.565228774" lastFinishedPulling="2025-12-05 00:43:20.40135169 +0000 UTC m=+1219.617012670" observedRunningTime="2025-12-05 00:43:21.232535699 +0000 UTC m=+1220.448196669" watchObservedRunningTime="2025-12-05 00:43:21.250513832 +0000 UTC m=+1220.466174792" Dec 05 00:43:24 crc kubenswrapper[4759]: I1205 00:43:24.190805 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj" Dec 05 00:43:24 crc kubenswrapper[4759]: I1205 00:43:24.195536 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-ztxcj" Dec 05 00:43:24 crc kubenswrapper[4759]: I1205 00:43:24.302492 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-6gh82" Dec 05 00:43:24 crc kubenswrapper[4759]: I1205 00:43:24.326466 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-lrxdn" Dec 05 00:43:24 crc kubenswrapper[4759]: I1205 00:43:24.483001 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5h7t4" Dec 05 00:43:24 crc kubenswrapper[4759]: I1205 00:43:24.585043 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-s6jzr" Dec 05 00:43:24 crc kubenswrapper[4759]: I1205 00:43:24.621913 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-p5g9w" Dec 05 00:43:24 crc kubenswrapper[4759]: I1205 00:43:24.903833 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dwt5n" Dec 05 00:43:24 crc kubenswrapper[4759]: I1205 00:43:24.917542 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-j587r" Dec 05 00:43:26 crc kubenswrapper[4759]: I1205 00:43:26.178108 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9jfsw" Dec 05 00:43:26 crc kubenswrapper[4759]: I1205 00:43:26.686645 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g" Dec 05 00:43:27 crc kubenswrapper[4759]: I1205 00:43:27.018447 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-759bbb976c-dtqzv" Dec 05 00:43:27 crc kubenswrapper[4759]: E1205 00:43:27.159130 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" podUID="d832c1ee-6d66-4cd7-87eb-dc2d34f801cc" Dec 05 00:43:42 crc kubenswrapper[4759]: I1205 00:43:42.396744 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" event={"ID":"d832c1ee-6d66-4cd7-87eb-dc2d34f801cc","Type":"ContainerStarted","Data":"9c6f0ce758476344af1e27f3babcba93e7382cd4c73281700fac4e397353acc9"} Dec 05 00:43:42 crc kubenswrapper[4759]: I1205 00:43:42.416764 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-85hsm" podStartSLOduration=3.2360204599999998 podStartE2EDuration="1m8.416741102s" podCreationTimestamp="2025-12-05 00:42:34 +0000 UTC" firstStartedPulling="2025-12-05 00:42:36.402282767 +0000 UTC m=+1175.617943717" lastFinishedPulling="2025-12-05 00:43:41.583003409 +0000 UTC m=+1240.798664359" observedRunningTime="2025-12-05 00:43:42.414406405 +0000 UTC m=+1241.630067375" watchObservedRunningTime="2025-12-05 00:43:42.416741102 +0000 UTC m=+1241.632402062" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.732618 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-knvhq"] Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.734464 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.736954 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-fd8ln" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.737943 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.743210 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.743504 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.750541 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-knvhq"] Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.797131 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-465n6"] Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.798772 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.811389 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.813576 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-465n6"] Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.889927 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwp2n\" (UniqueName: \"kubernetes.io/projected/c9b2f485-455f-45fe-a382-1bdd6b102ad0-kube-api-access-rwp2n\") pod \"dnsmasq-dns-675f4bcbfc-knvhq\" (UID: \"c9b2f485-455f-45fe-a382-1bdd6b102ad0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.890018 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-config\") pod \"dnsmasq-dns-78dd6ddcc-465n6\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.890216 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b2f485-455f-45fe-a382-1bdd6b102ad0-config\") pod \"dnsmasq-dns-675f4bcbfc-knvhq\" (UID: \"c9b2f485-455f-45fe-a382-1bdd6b102ad0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.890433 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpthb\" (UniqueName: \"kubernetes.io/projected/ff24588f-2187-4e9a-b054-e6f62b476167-kube-api-access-dpthb\") pod \"dnsmasq-dns-78dd6ddcc-465n6\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.890482 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-465n6\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.998571 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwp2n\" (UniqueName: \"kubernetes.io/projected/c9b2f485-455f-45fe-a382-1bdd6b102ad0-kube-api-access-rwp2n\") pod \"dnsmasq-dns-675f4bcbfc-knvhq\" (UID: \"c9b2f485-455f-45fe-a382-1bdd6b102ad0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.998637 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-config\") pod \"dnsmasq-dns-78dd6ddcc-465n6\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.998704 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b2f485-455f-45fe-a382-1bdd6b102ad0-config\") pod \"dnsmasq-dns-675f4bcbfc-knvhq\" (UID: \"c9b2f485-455f-45fe-a382-1bdd6b102ad0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" Dec 
05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.998774 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpthb\" (UniqueName: \"kubernetes.io/projected/ff24588f-2187-4e9a-b054-e6f62b476167-kube-api-access-dpthb\") pod \"dnsmasq-dns-78dd6ddcc-465n6\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:43:59 crc kubenswrapper[4759]: I1205 00:43:59.998797 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-465n6\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:44:00 crc kubenswrapper[4759]: I1205 00:44:00.000341 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-465n6\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:44:00 crc kubenswrapper[4759]: I1205 00:44:00.002330 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-config\") pod \"dnsmasq-dns-78dd6ddcc-465n6\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:44:00 crc kubenswrapper[4759]: I1205 00:44:00.003038 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b2f485-455f-45fe-a382-1bdd6b102ad0-config\") pod \"dnsmasq-dns-675f4bcbfc-knvhq\" (UID: \"c9b2f485-455f-45fe-a382-1bdd6b102ad0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" Dec 05 00:44:00 crc kubenswrapper[4759]: I1205 00:44:00.020641 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwp2n\" (UniqueName: \"kubernetes.io/projected/c9b2f485-455f-45fe-a382-1bdd6b102ad0-kube-api-access-rwp2n\") pod \"dnsmasq-dns-675f4bcbfc-knvhq\" (UID: \"c9b2f485-455f-45fe-a382-1bdd6b102ad0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" Dec 05 00:44:00 crc kubenswrapper[4759]: I1205 00:44:00.021432 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpthb\" (UniqueName: \"kubernetes.io/projected/ff24588f-2187-4e9a-b054-e6f62b476167-kube-api-access-dpthb\") pod \"dnsmasq-dns-78dd6ddcc-465n6\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:44:00 crc kubenswrapper[4759]: I1205 00:44:00.058858 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" Dec 05 00:44:00 crc kubenswrapper[4759]: I1205 00:44:00.121470 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:44:00 crc kubenswrapper[4759]: I1205 00:44:00.536676 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-knvhq"] Dec 05 00:44:00 crc kubenswrapper[4759]: I1205 00:44:00.574343 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" event={"ID":"c9b2f485-455f-45fe-a382-1bdd6b102ad0","Type":"ContainerStarted","Data":"4885da82b743b86a1e78f29c9d815d0b305d16f565fe1790415225e92a9943ab"} Dec 05 00:44:00 crc kubenswrapper[4759]: I1205 00:44:00.651690 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-465n6"] Dec 05 00:44:00 crc kubenswrapper[4759]: W1205 00:44:00.660242 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff24588f_2187_4e9a_b054_e6f62b476167.slice/crio-810a5b786c4a109a35327ea2cf40e7c97b596adca9bc237d0060a427b19eb353 WatchSource:0}: Error finding container 810a5b786c4a109a35327ea2cf40e7c97b596adca9bc237d0060a427b19eb353: Status 404 returned error can't find the container with id 810a5b786c4a109a35327ea2cf40e7c97b596adca9bc237d0060a427b19eb353 Dec 05 00:44:01 crc kubenswrapper[4759]: I1205 00:44:01.591659 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" event={"ID":"ff24588f-2187-4e9a-b054-e6f62b476167","Type":"ContainerStarted","Data":"810a5b786c4a109a35327ea2cf40e7c97b596adca9bc237d0060a427b19eb353"} Dec 05 00:44:02 crc kubenswrapper[4759]: I1205 00:44:02.777262 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-knvhq"] Dec 05 00:44:02 crc kubenswrapper[4759]: I1205 00:44:02.798978 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rdjwr"] Dec 05 00:44:02 crc kubenswrapper[4759]: I1205 00:44:02.800465 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:02 crc kubenswrapper[4759]: I1205 00:44:02.821837 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rdjwr"] Dec 05 00:44:02 crc kubenswrapper[4759]: I1205 00:44:02.950525 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfbzl\" (UniqueName: \"kubernetes.io/projected/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-kube-api-access-kfbzl\") pod \"dnsmasq-dns-666b6646f7-rdjwr\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:02 crc kubenswrapper[4759]: I1205 00:44:02.950646 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-config\") pod \"dnsmasq-dns-666b6646f7-rdjwr\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:02 crc kubenswrapper[4759]: I1205 00:44:02.950723 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rdjwr\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.053226 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfbzl\" (UniqueName: \"kubernetes.io/projected/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-kube-api-access-kfbzl\") pod \"dnsmasq-dns-666b6646f7-rdjwr\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.053296 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-config\") pod \"dnsmasq-dns-666b6646f7-rdjwr\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.053359 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rdjwr\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.054155 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rdjwr\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.055050 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-config\") pod \"dnsmasq-dns-666b6646f7-rdjwr\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.065145 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-465n6"] Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.090149 
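
Note: every volume attached to a new pod in this section walks the same reconciler sequence: operationExecutor.VerifyControllerAttachedVolume started (reconciler_common.go:245), operationExecutor.MountVolume started (reconciler_common.go:218), then MountVolume.SetUp succeeded (operation_generator.go:637). The surrounding dnsmasq-dns ADD/UPDATE/DELETE churn is a Deployment being re-rendered: each config change yields a new ReplicaSet hash (675f4bcbfc, 78dd6ddcc, 666b6646f7, 57d769cc4f), so pods of the old hash are deleted as pods of the new hash are created. A throwaway scraper that tallies the three phases per volume UniqueName from a journal on stdin (illustrative log scraping, not a kubelet API):

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    var started = regexp.MustCompile(`operationExecutor\.(VerifyControllerAttachedVolume|MountVolume) started for volume .*?UniqueName: \\"([^"\\]+)\\"`)
    var setUp = regexp.MustCompile(`MountVolume\.SetUp succeeded for volume .*?UniqueName: \\"([^"\\]+)\\"`)

    func main() {
        phases := map[string][]string{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines are long
        for sc.Scan() {
            line := sc.Text()
            if m := started.FindStringSubmatch(line); m != nil {
                phases[m[2]] = append(phases[m[2]], m[1])
            }
            if m := setUp.FindStringSubmatch(line); m != nil {
                phases[m[1]] = append(phases[m[1]], "SetUpSucceeded")
            }
        }
        for vol, seq := range phases {
            fmt.Println(vol, seq) // expect Verify... -> MountVolume -> SetUpSucceeded per volume
        }
    }
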
4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g2xqq"] Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.107868 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.109474 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfbzl\" (UniqueName: \"kubernetes.io/projected/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-kube-api-access-kfbzl\") pod \"dnsmasq-dns-666b6646f7-rdjwr\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.134475 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.139917 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g2xqq"] Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.272939 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-config\") pod \"dnsmasq-dns-57d769cc4f-g2xqq\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.273033 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-g2xqq\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.273122 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj776\" (UniqueName: \"kubernetes.io/projected/3be9cea9-8244-4151-8379-0195909a1399-kube-api-access-wj776\") pod \"dnsmasq-dns-57d769cc4f-g2xqq\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.374667 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-config\") pod \"dnsmasq-dns-57d769cc4f-g2xqq\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.375060 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-g2xqq\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.375109 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj776\" (UniqueName: \"kubernetes.io/projected/3be9cea9-8244-4151-8379-0195909a1399-kube-api-access-wj776\") pod \"dnsmasq-dns-57d769cc4f-g2xqq\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.377163 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-config\") pod \"dnsmasq-dns-57d769cc4f-g2xqq\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.377882 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-g2xqq\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.414412 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj776\" (UniqueName: \"kubernetes.io/projected/3be9cea9-8244-4151-8379-0195909a1399-kube-api-access-wj776\") pod \"dnsmasq-dns-57d769cc4f-g2xqq\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.523824 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.668599 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rdjwr"] Dec 05 00:44:03 crc kubenswrapper[4759]: W1205 00:44:03.682099 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf352d2ed_1545_4f6d_8f69_e30a67b4d3c9.slice/crio-f97c75db4ec1e275365a3cf4ee08985fc4ea7fb149a8801cc957259f75496a63 WatchSource:0}: Error finding container f97c75db4ec1e275365a3cf4ee08985fc4ea7fb149a8801cc957259f75496a63: Status 404 returned error can't find the container with id f97c75db4ec1e275365a3cf4ee08985fc4ea7fb149a8801cc957259f75496a63 Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.898360 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.905263 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.907972 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.907985 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9xqwl" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.908057 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.908216 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.908277 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.908374 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.910748 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.920272 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 00:44:03 crc kubenswrapper[4759]: I1205 00:44:03.980182 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g2xqq"] Dec 05 00:44:03 crc kubenswrapper[4759]: W1205 00:44:03.986869 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3be9cea9_8244_4151_8379_0195909a1399.slice/crio-1589121b1f31dc45833b965c7ecd14c7fb44883baed69fcd7c2051a1c7f0733e WatchSource:0}: Error finding container 1589121b1f31dc45833b965c7ecd14c7fb44883baed69fcd7c2051a1c7f0733e: Status 404 returned error can't find the container with id 1589121b1f31dc45833b965c7ecd14c7fb44883baed69fcd7c2051a1c7f0733e Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.086901 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.086955 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.087002 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.087021 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fbp\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-kube-api-access-79fbp\") pod \"rabbitmq-server-0\" (UID: 
\"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.087043 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f171e66-8683-4856-8dd3-690bbdd0f6e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.087068 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.087294 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.087398 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.087431 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.087496 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.087536 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f171e66-8683-4856-8dd3-690bbdd0f6e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.189435 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.189510 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 
00:44:04.189552 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.189575 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.189611 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.189649 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f171e66-8683-4856-8dd3-690bbdd0f6e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.189703 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.189740 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.189792 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.189814 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fbp\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-kube-api-access-79fbp\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.189843 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f171e66-8683-4856-8dd3-690bbdd0f6e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.190370 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.190497 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.190953 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.191138 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.191704 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.192833 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.195964 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f171e66-8683-4856-8dd3-690bbdd0f6e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.196385 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.196494 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.207652 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f171e66-8683-4856-8dd3-690bbdd0f6e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.209983 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fbp\" (UniqueName: 
\"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-kube-api-access-79fbp\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.227567 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.230640 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.241223 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.241537 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.241916 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-sxw7j" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.242057 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.242203 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.242247 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.244724 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.248955 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.253528 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.396065 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.396174 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.397837 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.398066 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a699edc7-e60f-482d-962d-6c69d625a1c5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.398093 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.398147 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a699edc7-e60f-482d-962d-6c69d625a1c5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.398189 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.398227 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.398287 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22z8j\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-kube-api-access-22z8j\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.398355 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.398434 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.433898 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.433963 4759 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.499850 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.499901 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.499924 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22z8j\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-kube-api-access-22z8j\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.499958 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.499987 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.500009 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.500050 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.500075 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.500102 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/a699edc7-e60f-482d-962d-6c69d625a1c5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.500118 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.500150 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a699edc7-e60f-482d-962d-6c69d625a1c5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.501978 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.502253 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.502422 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.502945 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.504028 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.504100 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.508339 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a699edc7-e60f-482d-962d-6c69d625a1c5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.510633 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.522189 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.523968 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a699edc7-e60f-482d-962d-6c69d625a1c5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.529115 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22z8j\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-kube-api-access-22z8j\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.533083 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.539737 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.614839 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.625745 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" event={"ID":"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9","Type":"ContainerStarted","Data":"f97c75db4ec1e275365a3cf4ee08985fc4ea7fb149a8801cc957259f75496a63"} Dec 05 00:44:04 crc kubenswrapper[4759]: I1205 00:44:04.627171 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" event={"ID":"3be9cea9-8244-4151-8379-0195909a1399","Type":"ContainerStarted","Data":"1589121b1f31dc45833b965c7ecd14c7fb44883baed69fcd7c2051a1c7f0733e"} Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.607870 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.609606 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.614347 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bf2bm" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.614408 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.614916 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.618753 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.623731 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.637040 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.725178 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.725250 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhgqn\" (UniqueName: \"kubernetes.io/projected/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-kube-api-access-bhgqn\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.725324 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.725349 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-config-data-default\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.725384 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.725415 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.725438 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-kolla-config\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.725476 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.827129 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.827207 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-config-data-default\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.827264 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.827332 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.827370 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-kolla-config\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.827410 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.827428 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.827549 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.827606 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhgqn\" (UniqueName: \"kubernetes.io/projected/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-kube-api-access-bhgqn\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.829068 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-config-data-default\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.829579 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.833173 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-kolla-config\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.833435 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.834275 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.835922 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.856164 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhgqn\" (UniqueName: \"kubernetes.io/projected/1013063a-9cdb-47ba-8c7d-5161bbbad9d4-kube-api-access-bhgqn\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.857393 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"1013063a-9cdb-47ba-8c7d-5161bbbad9d4\") " pod="openstack/openstack-galera-0" Dec 05 00:44:05 crc kubenswrapper[4759]: I1205 00:44:05.944709 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.074084 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.076287 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.078679 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ldrnr" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.079223 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.082708 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.084442 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.094087 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.147400 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cb3095df-1b95-485e-99b5-6a3886c58ac3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.147452 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75dnp\" (UniqueName: \"kubernetes.io/projected/cb3095df-1b95-485e-99b5-6a3886c58ac3-kube-api-access-75dnp\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.147478 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3095df-1b95-485e-99b5-6a3886c58ac3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.147525 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb3095df-1b95-485e-99b5-6a3886c58ac3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.147557 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.147597 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3095df-1b95-485e-99b5-6a3886c58ac3-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.147627 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb3095df-1b95-485e-99b5-6a3886c58ac3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.147669 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cb3095df-1b95-485e-99b5-6a3886c58ac3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.249456 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cb3095df-1b95-485e-99b5-6a3886c58ac3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.249679 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cb3095df-1b95-485e-99b5-6a3886c58ac3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.249697 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75dnp\" (UniqueName: \"kubernetes.io/projected/cb3095df-1b95-485e-99b5-6a3886c58ac3-kube-api-access-75dnp\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.249732 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3095df-1b95-485e-99b5-6a3886c58ac3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.249790 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb3095df-1b95-485e-99b5-6a3886c58ac3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.249825 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.249982 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3095df-1b95-485e-99b5-6a3886c58ac3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.250044 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb3095df-1b95-485e-99b5-6a3886c58ac3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.252260 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cb3095df-1b95-485e-99b5-6a3886c58ac3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.252403 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb3095df-1b95-485e-99b5-6a3886c58ac3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.252920 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.253672 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cb3095df-1b95-485e-99b5-6a3886c58ac3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.255100 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb3095df-1b95-485e-99b5-6a3886c58ac3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.266402 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3095df-1b95-485e-99b5-6a3886c58ac3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.266587 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3095df-1b95-485e-99b5-6a3886c58ac3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.272061 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75dnp\" (UniqueName: \"kubernetes.io/projected/cb3095df-1b95-485e-99b5-6a3886c58ac3-kube-api-access-75dnp\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc 
kubenswrapper[4759]: I1205 00:44:07.292129 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cb3095df-1b95-485e-99b5-6a3886c58ac3\") " pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.429162 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.675761 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.677126 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.679584 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.679729 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-zvl8k" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.680252 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.702525 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.774727 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfpt4\" (UniqueName: \"kubernetes.io/projected/cf79c940-d58e-4319-94e8-6bacc34b1ae5-kube-api-access-kfpt4\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.774808 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf79c940-d58e-4319-94e8-6bacc34b1ae5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.774852 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf79c940-d58e-4319-94e8-6bacc34b1ae5-config-data\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.774884 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf79c940-d58e-4319-94e8-6bacc34b1ae5-kolla-config\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.775268 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf79c940-d58e-4319-94e8-6bacc34b1ae5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.877352 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cf79c940-d58e-4319-94e8-6bacc34b1ae5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.877420 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfpt4\" (UniqueName: \"kubernetes.io/projected/cf79c940-d58e-4319-94e8-6bacc34b1ae5-kube-api-access-kfpt4\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.877454 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf79c940-d58e-4319-94e8-6bacc34b1ae5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.877472 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf79c940-d58e-4319-94e8-6bacc34b1ae5-config-data\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.877494 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf79c940-d58e-4319-94e8-6bacc34b1ae5-kolla-config\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.878410 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf79c940-d58e-4319-94e8-6bacc34b1ae5-kolla-config\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.879140 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf79c940-d58e-4319-94e8-6bacc34b1ae5-config-data\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.882509 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf79c940-d58e-4319-94e8-6bacc34b1ae5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.884916 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf79c940-d58e-4319-94e8-6bacc34b1ae5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.901136 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfpt4\" (UniqueName: \"kubernetes.io/projected/cf79c940-d58e-4319-94e8-6bacc34b1ae5-kube-api-access-kfpt4\") pod \"memcached-0\" (UID: \"cf79c940-d58e-4319-94e8-6bacc34b1ae5\") " pod="openstack/memcached-0" Dec 05 00:44:07 crc kubenswrapper[4759]: I1205 00:44:07.993413 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 05 00:44:09 crc kubenswrapper[4759]: I1205 00:44:09.429644 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 00:44:09 crc kubenswrapper[4759]: I1205 00:44:09.431427 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 00:44:09 crc kubenswrapper[4759]: I1205 00:44:09.433415 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gqtv4" Dec 05 00:44:09 crc kubenswrapper[4759]: I1205 00:44:09.439969 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 00:44:09 crc kubenswrapper[4759]: I1205 00:44:09.509857 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8mn7\" (UniqueName: \"kubernetes.io/projected/9fdfe467-6f94-4139-b67a-73d6b69e7753-kube-api-access-h8mn7\") pod \"kube-state-metrics-0\" (UID: \"9fdfe467-6f94-4139-b67a-73d6b69e7753\") " pod="openstack/kube-state-metrics-0" Dec 05 00:44:09 crc kubenswrapper[4759]: I1205 00:44:09.610860 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8mn7\" (UniqueName: \"kubernetes.io/projected/9fdfe467-6f94-4139-b67a-73d6b69e7753-kube-api-access-h8mn7\") pod \"kube-state-metrics-0\" (UID: \"9fdfe467-6f94-4139-b67a-73d6b69e7753\") " pod="openstack/kube-state-metrics-0" Dec 05 00:44:09 crc kubenswrapper[4759]: I1205 00:44:09.645108 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8mn7\" (UniqueName: \"kubernetes.io/projected/9fdfe467-6f94-4139-b67a-73d6b69e7753-kube-api-access-h8mn7\") pod \"kube-state-metrics-0\" (UID: \"9fdfe467-6f94-4139-b67a-73d6b69e7753\") " pod="openstack/kube-state-metrics-0" Dec 05 00:44:09 crc kubenswrapper[4759]: I1205 00:44:09.770235 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.084160 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69"] Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.085665 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.090647 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.090837 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-ffhbk" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.094453 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69"] Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.222643 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcrt2\" (UniqueName: \"kubernetes.io/projected/3555f68d-fb68-4cf9-91e0-51cc25d2305c-kube-api-access-tcrt2\") pod \"observability-ui-dashboards-7d5fb4cbfb-q8g69\" (UID: \"3555f68d-fb68-4cf9-91e0-51cc25d2305c\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.222772 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3555f68d-fb68-4cf9-91e0-51cc25d2305c-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-q8g69\" (UID: \"3555f68d-fb68-4cf9-91e0-51cc25d2305c\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.323907 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3555f68d-fb68-4cf9-91e0-51cc25d2305c-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-q8g69\" (UID: \"3555f68d-fb68-4cf9-91e0-51cc25d2305c\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.324031 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcrt2\" (UniqueName: \"kubernetes.io/projected/3555f68d-fb68-4cf9-91e0-51cc25d2305c-kube-api-access-tcrt2\") pod \"observability-ui-dashboards-7d5fb4cbfb-q8g69\" (UID: \"3555f68d-fb68-4cf9-91e0-51cc25d2305c\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.341355 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3555f68d-fb68-4cf9-91e0-51cc25d2305c-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-q8g69\" (UID: \"3555f68d-fb68-4cf9-91e0-51cc25d2305c\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.351663 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcrt2\" (UniqueName: \"kubernetes.io/projected/3555f68d-fb68-4cf9-91e0-51cc25d2305c-kube-api-access-tcrt2\") pod \"observability-ui-dashboards-7d5fb4cbfb-q8g69\" (UID: \"3555f68d-fb68-4cf9-91e0-51cc25d2305c\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.403450 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-dc4d54c97-74r7x"] Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.404490 4759 util.go:30] "No sandbox 
Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.414818 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69"
Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.434724 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dc4d54c97-74r7x"]
Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.529383 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-console-config\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x"
Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.529423 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxcgq\" (UniqueName: \"kubernetes.io/projected/c398cc02-7157-46ff-b99d-441d94f9b2f5-kube-api-access-bxcgq\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x"
Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.529457 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c398cc02-7157-46ff-b99d-441d94f9b2f5-console-oauth-config\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x"
Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.529475 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-trusted-ca-bundle\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x"
Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.529500 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-oauth-serving-cert\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x"
Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.529543 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-service-ca\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x"
Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.529568 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c398cc02-7157-46ff-b99d-441d94f9b2f5-console-serving-cert\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x"
Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.631696 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-console-config\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x"
\"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-console-config\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.631746 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxcgq\" (UniqueName: \"kubernetes.io/projected/c398cc02-7157-46ff-b99d-441d94f9b2f5-kube-api-access-bxcgq\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.631784 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c398cc02-7157-46ff-b99d-441d94f9b2f5-console-oauth-config\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.631806 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-trusted-ca-bundle\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.631836 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-oauth-serving-cert\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.631882 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-service-ca\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.631910 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c398cc02-7157-46ff-b99d-441d94f9b2f5-console-serving-cert\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.632619 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-console-config\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.633254 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-oauth-serving-cert\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.633547 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-trusted-ca-bundle\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.633886 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c398cc02-7157-46ff-b99d-441d94f9b2f5-service-ca\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.635447 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c398cc02-7157-46ff-b99d-441d94f9b2f5-console-serving-cert\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.645326 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c398cc02-7157-46ff-b99d-441d94f9b2f5-console-oauth-config\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.652522 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxcgq\" (UniqueName: \"kubernetes.io/projected/c398cc02-7157-46ff-b99d-441d94f9b2f5-kube-api-access-bxcgq\") pod \"console-dc4d54c97-74r7x\" (UID: \"c398cc02-7157-46ff-b99d-441d94f9b2f5\") " pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.694573 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.697636 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.699065 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-jvk29" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.699644 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.700604 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.700682 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.700774 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.711266 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.712622 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.758497 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.861018 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.861075 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.861105 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.861144 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.861350 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhd88\" (UniqueName: \"kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-kube-api-access-qhd88\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.861669 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4700495a-dd6c-4b5d-a290-55ab3907a2f5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.861734 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.861840 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.963169 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.963241 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhd88\" (UniqueName: \"kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-kube-api-access-qhd88\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.963330 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4700495a-dd6c-4b5d-a290-55ab3907a2f5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.963359 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.963390 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.963426 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.963467 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.963507 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.963908 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.964904 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/4700495a-dd6c-4b5d-a290-55ab3907a2f5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.967554 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.970247 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.970476 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.970922 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.982003 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhd88\" (UniqueName: \"kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-kube-api-access-qhd88\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.982616 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:10 crc kubenswrapper[4759]: I1205 00:44:10.992937 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:11 crc kubenswrapper[4759]: I1205 00:44:11.073764 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.047803 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nw9xk"] Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.049835 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.052550 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zkbkw" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.053352 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.054408 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.055852 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nw9xk"] Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.069149 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-nscp4"] Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.071401 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.081449 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-nscp4"] Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.112466 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkht5\" (UniqueName: \"kubernetes.io/projected/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-kube-api-access-xkht5\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.112529 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-var-log-ovn\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.112553 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-combined-ca-bundle\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.112603 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-var-run-ovn\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.112626 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-scripts\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.112663 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-ovn-controller-tls-certs\") pod \"ovn-controller-nw9xk\" (UID: 
\"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.112699 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-var-run\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.213883 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkht5\" (UniqueName: \"kubernetes.io/projected/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-kube-api-access-xkht5\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.213958 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/87eff283-d384-4578-9a23-0d7dab551aab-var-log\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.213977 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-combined-ca-bundle\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.213996 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-var-log-ovn\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.214018 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/87eff283-d384-4578-9a23-0d7dab551aab-var-run\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.214136 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-var-run-ovn\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.214158 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-scripts\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.214587 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-var-log-ovn\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.214669 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-var-run-ovn\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.215030 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-ovn-controller-tls-certs\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.215064 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/87eff283-d384-4578-9a23-0d7dab551aab-var-lib\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.215115 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-var-run\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.215241 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-var-run\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.215133 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87eff283-d384-4578-9a23-0d7dab551aab-scripts\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.215452 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/87eff283-d384-4578-9a23-0d7dab551aab-etc-ovs\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.216911 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svc72\" (UniqueName: \"kubernetes.io/projected/87eff283-d384-4578-9a23-0d7dab551aab-kube-api-access-svc72\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.217122 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-scripts\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.219599 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-combined-ca-bundle\") pod 
\"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.227225 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-ovn-controller-tls-certs\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.231719 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkht5\" (UniqueName: \"kubernetes.io/projected/d4b47f07-88f8-4a9a-97ee-7c61be8a6235-kube-api-access-xkht5\") pod \"ovn-controller-nw9xk\" (UID: \"d4b47f07-88f8-4a9a-97ee-7c61be8a6235\") " pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.318092 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87eff283-d384-4578-9a23-0d7dab551aab-scripts\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.318172 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/87eff283-d384-4578-9a23-0d7dab551aab-etc-ovs\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.318542 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/87eff283-d384-4578-9a23-0d7dab551aab-etc-ovs\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.318582 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svc72\" (UniqueName: \"kubernetes.io/projected/87eff283-d384-4578-9a23-0d7dab551aab-kube-api-access-svc72\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.318638 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/87eff283-d384-4578-9a23-0d7dab551aab-var-log\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.318659 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/87eff283-d384-4578-9a23-0d7dab551aab-var-run\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.318713 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/87eff283-d384-4578-9a23-0d7dab551aab-var-lib\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.318896 4759 
Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.319213 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/87eff283-d384-4578-9a23-0d7dab551aab-var-log\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4"
Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.319263 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/87eff283-d384-4578-9a23-0d7dab551aab-var-run\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4"
Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.320675 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87eff283-d384-4578-9a23-0d7dab551aab-scripts\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4"
Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.334727 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svc72\" (UniqueName: \"kubernetes.io/projected/87eff283-d384-4578-9a23-0d7dab551aab-kube-api-access-svc72\") pod \"ovn-controller-ovs-nscp4\" (UID: \"87eff283-d384-4578-9a23-0d7dab551aab\") " pod="openstack/ovn-controller-ovs-nscp4"
Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.388088 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nw9xk"
Dec 05 00:44:13 crc kubenswrapper[4759]: I1205 00:44:13.396799 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-nscp4"
Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.660524 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.665154 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
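Every new pod in this capture follows the same opening pattern: "SyncLoop ADD" from the api source (kubelet.go:2421), then "No sandbox for pod can be found. Need to start a new one" (util.go:30) once the sync worker first handles the pod and has to create its sandbox. Pairing the two per pod gives a coarse admission-to-first-sandbox delay. The sketch below is illustrative only, not tooling from this system; journal timestamps are second-granular, so treat the result as a rough bound.

```python
import re
import sys
from datetime import datetime

# SyncLoop ADD lines in this capture always list exactly one pod per event.
ADD = re.compile(r'(\w{3} \d{2} [\d:]{8}) .*"SyncLoop ADD" source="api" pods=\["([^"]+)"\]')
SANDBOX = re.compile(r'(\w{3} \d{2} [\d:]{8}) .*"No sandbox for pod can be found\. '
                     r'Need to start a new one" pod="([^"]+)"')

def stamp(s):
    # journald timestamps carry no year; the default (1900) is fine for deltas
    # within a single boot.
    return datetime.strptime(s, "%b %d %H:%M:%S")

adds = {}
for line in sys.stdin:
    if m := ADD.search(line):
        adds.setdefault(m.group(2), stamp(m.group(1)))
    elif (m := SANDBOX.search(line)) and m.group(2) in adds:
        dt = stamp(m.group(1)) - adds.pop(m.group(2))
        print(f"{m.group(2)}: first sandbox attempt after {dt.total_seconds():.0f}s")
```

Note that the sandbox message can repeat for the same pod (as it does for the observability-ui and console pods above); popping the entry on first match keeps only the initial pairing.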
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.669842 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-t52xb" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.670207 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.670469 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.670824 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.671120 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.680583 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.748546 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42baeb94-be38-4927-bb0d-9b37877cf412-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.748623 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vvr6\" (UniqueName: \"kubernetes.io/projected/42baeb94-be38-4927-bb0d-9b37877cf412-kube-api-access-5vvr6\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.748781 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42baeb94-be38-4927-bb0d-9b37877cf412-config\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.748838 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.748900 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/42baeb94-be38-4927-bb0d-9b37877cf412-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.748917 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42baeb94-be38-4927-bb0d-9b37877cf412-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.748965 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/42baeb94-be38-4927-bb0d-9b37877cf412-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.748996 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42baeb94-be38-4927-bb0d-9b37877cf412-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.851080 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42baeb94-be38-4927-bb0d-9b37877cf412-config\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.851179 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.851227 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/42baeb94-be38-4927-bb0d-9b37877cf412-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.851248 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42baeb94-be38-4927-bb0d-9b37877cf412-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.851273 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/42baeb94-be38-4927-bb0d-9b37877cf412-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.851320 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42baeb94-be38-4927-bb0d-9b37877cf412-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.851349 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42baeb94-be38-4927-bb0d-9b37877cf412-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.851368 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vvr6\" (UniqueName: \"kubernetes.io/projected/42baeb94-be38-4927-bb0d-9b37877cf412-kube-api-access-5vvr6\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 
Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.852509 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/42baeb94-be38-4927-bb0d-9b37877cf412-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.853339 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42baeb94-be38-4927-bb0d-9b37877cf412-config\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.853726 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42baeb94-be38-4927-bb0d-9b37877cf412-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.857593 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42baeb94-be38-4927-bb0d-9b37877cf412-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.859607 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/42baeb94-be38-4927-bb0d-9b37877cf412-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.863757 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42baeb94-be38-4927-bb0d-9b37877cf412-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.870813 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vvr6\" (UniqueName: \"kubernetes.io/projected/42baeb94-be38-4927-bb0d-9b37877cf412-kube-api-access-5vvr6\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.900460 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"42baeb94-be38-4927-bb0d-9b37877cf412\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 00:44:14 crc kubenswrapper[4759]: I1205 00:44:14.998911 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.592237 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.594352 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.596771 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.596822 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-p2w95" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.597662 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.597900 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.630613 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.684434 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a28d6f96-86fb-420c-a292-8c65e0088079-config\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.684599 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a28d6f96-86fb-420c-a292-8c65e0088079-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.684696 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2svr\" (UniqueName: \"kubernetes.io/projected/a28d6f96-86fb-420c-a292-8c65e0088079-kube-api-access-j2svr\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.684749 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.684812 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a28d6f96-86fb-420c-a292-8c65e0088079-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.684855 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a28d6f96-86fb-420c-a292-8c65e0088079-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc 
Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.684939 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28d6f96-86fb-420c-a292-8c65e0088079-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.787280 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.787352 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a28d6f96-86fb-420c-a292-8c65e0088079-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.787383 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a28d6f96-86fb-420c-a292-8c65e0088079-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.787406 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a28d6f96-86fb-420c-a292-8c65e0088079-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.787429 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28d6f96-86fb-420c-a292-8c65e0088079-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.787478 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a28d6f96-86fb-420c-a292-8c65e0088079-config\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.787536 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a28d6f96-86fb-420c-a292-8c65e0088079-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0"
Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.787580 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2svr\" (UniqueName: \"kubernetes.io/projected/a28d6f96-86fb-420c-a292-8c65e0088079-kube-api-access-j2svr\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0"
pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.787761 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.789041 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a28d6f96-86fb-420c-a292-8c65e0088079-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.789168 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a28d6f96-86fb-420c-a292-8c65e0088079-config\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.789904 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a28d6f96-86fb-420c-a292-8c65e0088079-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.794816 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28d6f96-86fb-420c-a292-8c65e0088079-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.795345 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a28d6f96-86fb-420c-a292-8c65e0088079-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.800060 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a28d6f96-86fb-420c-a292-8c65e0088079-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.812378 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2svr\" (UniqueName: \"kubernetes.io/projected/a28d6f96-86fb-420c-a292-8c65e0088079-kube-api-access-j2svr\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.824555 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a28d6f96-86fb-420c-a292-8c65e0088079\") " pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:16 crc kubenswrapper[4759]: I1205 00:44:16.922592 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:19 crc kubenswrapper[4759]: I1205 00:44:19.644974 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 00:44:20 crc kubenswrapper[4759]: E1205 00:44:20.629994 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 00:44:20 crc kubenswrapper[4759]: E1205 00:44:20.630211 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rwp2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-knvhq_openstack(c9b2f485-455f-45fe-a382-1bdd6b102ad0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:44:20 crc kubenswrapper[4759]: E1205 00:44:20.631896 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" podUID="c9b2f485-455f-45fe-a382-1bdd6b102ad0" Dec 05 00:44:20 crc kubenswrapper[4759]: E1205 00:44:20.675509 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 00:44:20 crc kubenswrapper[4759]: E1205 00:44:20.675693 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dpthb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-465n6_openstack(ff24588f-2187-4e9a-b054-e6f62b476167): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:44:20 crc kubenswrapper[4759]: E1205 00:44:20.677650 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" podUID="ff24588f-2187-4e9a-b054-e6f62b476167" Dec 05 00:44:20 crc kubenswrapper[4759]: I1205 00:44:20.822805 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1013063a-9cdb-47ba-8c7d-5161bbbad9d4","Type":"ContainerStarted","Data":"42e78584c41db8b6b83c0eec6bfc11b56a413c7f58b8079a0c3d0323373ef406"} Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.330289 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.440041 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 00:44:21 crc kubenswrapper[4759]: W1205 00:44:21.454319 4759 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda699edc7_e60f_482d_962d_6c69d625a1c5.slice/crio-9464d48439fba77e99da0d777e6bcfb47ccaa8c5a1ea0daba13a9a8d8efeeb9b WatchSource:0}: Error finding container 9464d48439fba77e99da0d777e6bcfb47ccaa8c5a1ea0daba13a9a8d8efeeb9b: Status 404 returned error can't find the container with id 9464d48439fba77e99da0d777e6bcfb47ccaa8c5a1ea0daba13a9a8d8efeeb9b Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.457212 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.483661 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.487789 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.585939 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b2f485-455f-45fe-a382-1bdd6b102ad0-config\") pod \"c9b2f485-455f-45fe-a382-1bdd6b102ad0\" (UID: \"c9b2f485-455f-45fe-a382-1bdd6b102ad0\") " Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.586013 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-config\") pod \"ff24588f-2187-4e9a-b054-e6f62b476167\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.586074 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwp2n\" (UniqueName: \"kubernetes.io/projected/c9b2f485-455f-45fe-a382-1bdd6b102ad0-kube-api-access-rwp2n\") pod \"c9b2f485-455f-45fe-a382-1bdd6b102ad0\" (UID: \"c9b2f485-455f-45fe-a382-1bdd6b102ad0\") " Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.586128 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-dns-svc\") pod \"ff24588f-2187-4e9a-b054-e6f62b476167\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.586182 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpthb\" (UniqueName: \"kubernetes.io/projected/ff24588f-2187-4e9a-b054-e6f62b476167-kube-api-access-dpthb\") pod \"ff24588f-2187-4e9a-b054-e6f62b476167\" (UID: \"ff24588f-2187-4e9a-b054-e6f62b476167\") " Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.586541 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b2f485-455f-45fe-a382-1bdd6b102ad0-config" (OuterVolumeSpecName: "config") pod "c9b2f485-455f-45fe-a382-1bdd6b102ad0" (UID: "c9b2f485-455f-45fe-a382-1bdd6b102ad0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.586562 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-config" (OuterVolumeSpecName: "config") pod "ff24588f-2187-4e9a-b054-e6f62b476167" (UID: "ff24588f-2187-4e9a-b054-e6f62b476167"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.586671 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff24588f-2187-4e9a-b054-e6f62b476167" (UID: "ff24588f-2187-4e9a-b054-e6f62b476167"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.591312 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b2f485-455f-45fe-a382-1bdd6b102ad0-kube-api-access-rwp2n" (OuterVolumeSpecName: "kube-api-access-rwp2n") pod "c9b2f485-455f-45fe-a382-1bdd6b102ad0" (UID: "c9b2f485-455f-45fe-a382-1bdd6b102ad0"). InnerVolumeSpecName "kube-api-access-rwp2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.592673 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff24588f-2187-4e9a-b054-e6f62b476167-kube-api-access-dpthb" (OuterVolumeSpecName: "kube-api-access-dpthb") pod "ff24588f-2187-4e9a-b054-e6f62b476167" (UID: "ff24588f-2187-4e9a-b054-e6f62b476167"). InnerVolumeSpecName "kube-api-access-dpthb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.687415 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpthb\" (UniqueName: \"kubernetes.io/projected/ff24588f-2187-4e9a-b054-e6f62b476167-kube-api-access-dpthb\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.687448 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b2f485-455f-45fe-a382-1bdd6b102ad0-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.687458 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.687466 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwp2n\" (UniqueName: \"kubernetes.io/projected/c9b2f485-455f-45fe-a382-1bdd6b102ad0-kube-api-access-rwp2n\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.687475 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff24588f-2187-4e9a-b054-e6f62b476167-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.831236 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9fdfe467-6f94-4139-b67a-73d6b69e7753","Type":"ContainerStarted","Data":"e9dd9604bef12a4a9221d424b12fad29eb2aaad5a9e8521566271ba5c612204e"} Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.835660 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" event={"ID":"ff24588f-2187-4e9a-b054-e6f62b476167","Type":"ContainerDied","Data":"810a5b786c4a109a35327ea2cf40e7c97b596adca9bc237d0060a427b19eb353"} Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.835679 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-465n6" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.837197 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a699edc7-e60f-482d-962d-6c69d625a1c5","Type":"ContainerStarted","Data":"9464d48439fba77e99da0d777e6bcfb47ccaa8c5a1ea0daba13a9a8d8efeeb9b"} Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.840563 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" event={"ID":"c9b2f485-455f-45fe-a382-1bdd6b102ad0","Type":"ContainerDied","Data":"4885da82b743b86a1e78f29c9d815d0b305d16f565fe1790415225e92a9943ab"} Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.840596 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-knvhq" Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.844886 4759 generic.go:334] "Generic (PLEG): container finished" podID="f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" containerID="410d821248d3bd1d09a5849e79ac5fffc12bff5d8398579d31f87ae21711db38" exitCode=0 Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.844928 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" event={"ID":"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9","Type":"ContainerDied","Data":"410d821248d3bd1d09a5849e79ac5fffc12bff5d8398579d31f87ae21711db38"} Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.851007 4759 generic.go:334] "Generic (PLEG): container finished" podID="3be9cea9-8244-4151-8379-0195909a1399" containerID="7138cd41fd191aa61271fda180b6a63fc77b0a05e8601f1d506ae4bedb21c54b" exitCode=0 Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.851059 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" event={"ID":"3be9cea9-8244-4151-8379-0195909a1399","Type":"ContainerDied","Data":"7138cd41fd191aa61271fda180b6a63fc77b0a05e8601f1d506ae4bedb21c54b"} Dec 05 00:44:21 crc kubenswrapper[4759]: I1205 00:44:21.855123 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4700495a-dd6c-4b5d-a290-55ab3907a2f5","Type":"ContainerStarted","Data":"93fba10ffa67c0221fb64c3463234950e2c4b272b2f60bfa2adf190698baaeb4"} Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.028028 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-465n6"] Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.049284 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-465n6"] Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.066528 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.074553 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69"] Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.085573 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 00:44:22 crc kubenswrapper[4759]: W1205 00:44:22.100116 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3555f68d_fb68_4cf9_91e0_51cc25d2305c.slice/crio-2f767093d9fb31e6a2c3d5032481f107372fd22dfbcfa57d0f9286949ce04af1 WatchSource:0}: Error finding container 
2f767093d9fb31e6a2c3d5032481f107372fd22dfbcfa57d0f9286949ce04af1: Status 404 returned error can't find the container with id 2f767093d9fb31e6a2c3d5032481f107372fd22dfbcfa57d0f9286949ce04af1 Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.105167 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-knvhq"] Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.116497 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.123587 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-knvhq"] Dec 05 00:44:22 crc kubenswrapper[4759]: W1205 00:44:22.134673 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4b47f07_88f8_4a9a_97ee_7c61be8a6235.slice/crio-0597f196d28fdf77685591af0a1d313114bafbd768b893bdd1fc85fc07e2fe86 WatchSource:0}: Error finding container 0597f196d28fdf77685591af0a1d313114bafbd768b893bdd1fc85fc07e2fe86: Status 404 returned error can't find the container with id 0597f196d28fdf77685591af0a1d313114bafbd768b893bdd1fc85fc07e2fe86 Dec 05 00:44:22 crc kubenswrapper[4759]: W1205 00:44:22.142894 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb3095df_1b95_485e_99b5_6a3886c58ac3.slice/crio-ac190a362b7d0ce86921846f7f3f314025cc22881ce2a753efe30e946cb1e8b1 WatchSource:0}: Error finding container ac190a362b7d0ce86921846f7f3f314025cc22881ce2a753efe30e946cb1e8b1: Status 404 returned error can't find the container with id ac190a362b7d0ce86921846f7f3f314025cc22881ce2a753efe30e946cb1e8b1 Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.147042 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nw9xk"] Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.157338 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dc4d54c97-74r7x"] Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.336041 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-nscp4"] Dec 05 00:44:22 crc kubenswrapper[4759]: W1205 00:44:22.713552 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87eff283_d384_4578_9a23_0d7dab551aab.slice/crio-596524585e52e4c1fe18218a07b2b13fbaa9053f2d0302c273d9031b011c1e79 WatchSource:0}: Error finding container 596524585e52e4c1fe18218a07b2b13fbaa9053f2d0302c273d9031b011c1e79: Status 404 returned error can't find the container with id 596524585e52e4c1fe18218a07b2b13fbaa9053f2d0302c273d9031b011c1e79 Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.869510 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw9xk" event={"ID":"d4b47f07-88f8-4a9a-97ee-7c61be8a6235","Type":"ContainerStarted","Data":"0597f196d28fdf77685591af0a1d313114bafbd768b893bdd1fc85fc07e2fe86"} Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.871950 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f171e66-8683-4856-8dd3-690bbdd0f6e5","Type":"ContainerStarted","Data":"a1465f49b94d3a279c12de5be1cfbd3addbf482cea648523cc4fe4d63455be09"} Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.873586 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"cf79c940-d58e-4319-94e8-6bacc34b1ae5","Type":"ContainerStarted","Data":"ffaccee254fd28f1256c6573bf5be94ca681f137b50f34d97def6ab716c832e8"} Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.874659 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cb3095df-1b95-485e-99b5-6a3886c58ac3","Type":"ContainerStarted","Data":"ac190a362b7d0ce86921846f7f3f314025cc22881ce2a753efe30e946cb1e8b1"} Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.876009 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nscp4" event={"ID":"87eff283-d384-4578-9a23-0d7dab551aab","Type":"ContainerStarted","Data":"596524585e52e4c1fe18218a07b2b13fbaa9053f2d0302c273d9031b011c1e79"} Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.877887 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dc4d54c97-74r7x" event={"ID":"c398cc02-7157-46ff-b99d-441d94f9b2f5","Type":"ContainerStarted","Data":"06c8857acbfdcf678238f5bcc1d9ace6190e12c6030fa1d566a312eadadee8b6"} Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.877936 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dc4d54c97-74r7x" event={"ID":"c398cc02-7157-46ff-b99d-441d94f9b2f5","Type":"ContainerStarted","Data":"3b5759e015a48a5fa311ac93752ea9ec7fee533b454f0aa09a0eb8f5ba59bc33"} Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.884188 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" event={"ID":"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9","Type":"ContainerStarted","Data":"c9a20c795612542b760f8389c459ce088fe2a93b4d9effd123065cf767530412"} Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.884260 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.892392 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69" event={"ID":"3555f68d-fb68-4cf9-91e0-51cc25d2305c","Type":"ContainerStarted","Data":"2f767093d9fb31e6a2c3d5032481f107372fd22dfbcfa57d0f9286949ce04af1"} Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.901073 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-dc4d54c97-74r7x" podStartSLOduration=12.901053807 podStartE2EDuration="12.901053807s" podCreationTimestamp="2025-12-05 00:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:44:22.894660659 +0000 UTC m=+1282.110321619" watchObservedRunningTime="2025-12-05 00:44:22.901053807 +0000 UTC m=+1282.116714757" Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.909278 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" event={"ID":"3be9cea9-8244-4151-8379-0195909a1399","Type":"ContainerStarted","Data":"91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d"} Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.909454 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.941180 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" podStartSLOduration=3.680490389 
podStartE2EDuration="20.941160514s" podCreationTimestamp="2025-12-05 00:44:02 +0000 UTC" firstStartedPulling="2025-12-05 00:44:03.685114607 +0000 UTC m=+1262.900775557" lastFinishedPulling="2025-12-05 00:44:20.945784732 +0000 UTC m=+1280.161445682" observedRunningTime="2025-12-05 00:44:22.913571225 +0000 UTC m=+1282.129232175" watchObservedRunningTime="2025-12-05 00:44:22.941160514 +0000 UTC m=+1282.156821464" Dec 05 00:44:22 crc kubenswrapper[4759]: I1205 00:44:22.947777 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" podStartSLOduration=3.017488089 podStartE2EDuration="19.947766727s" podCreationTimestamp="2025-12-05 00:44:03 +0000 UTC" firstStartedPulling="2025-12-05 00:44:03.989384841 +0000 UTC m=+1263.205045791" lastFinishedPulling="2025-12-05 00:44:20.919663479 +0000 UTC m=+1280.135324429" observedRunningTime="2025-12-05 00:44:22.929002395 +0000 UTC m=+1282.144663345" watchObservedRunningTime="2025-12-05 00:44:22.947766727 +0000 UTC m=+1282.163427677" Dec 05 00:44:23 crc kubenswrapper[4759]: I1205 00:44:23.016515 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 00:44:23 crc kubenswrapper[4759]: W1205 00:44:23.133552 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42baeb94_be38_4927_bb0d_9b37877cf412.slice/crio-7679b1efbe1e5f51a81c18ab36dae37856097b3b614b1f72401358ce893ce161 WatchSource:0}: Error finding container 7679b1efbe1e5f51a81c18ab36dae37856097b3b614b1f72401358ce893ce161: Status 404 returned error can't find the container with id 7679b1efbe1e5f51a81c18ab36dae37856097b3b614b1f72401358ce893ce161 Dec 05 00:44:23 crc kubenswrapper[4759]: I1205 00:44:23.170018 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b2f485-455f-45fe-a382-1bdd6b102ad0" path="/var/lib/kubelet/pods/c9b2f485-455f-45fe-a382-1bdd6b102ad0/volumes" Dec 05 00:44:23 crc kubenswrapper[4759]: I1205 00:44:23.170513 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff24588f-2187-4e9a-b054-e6f62b476167" path="/var/lib/kubelet/pods/ff24588f-2187-4e9a-b054-e6f62b476167/volumes" Dec 05 00:44:23 crc kubenswrapper[4759]: I1205 00:44:23.251015 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 00:44:23 crc kubenswrapper[4759]: I1205 00:44:23.922910 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"42baeb94-be38-4927-bb0d-9b37877cf412","Type":"ContainerStarted","Data":"7679b1efbe1e5f51a81c18ab36dae37856097b3b614b1f72401358ce893ce161"} Dec 05 00:44:28 crc kubenswrapper[4759]: I1205 00:44:28.137595 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:28 crc kubenswrapper[4759]: I1205 00:44:28.525682 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:28 crc kubenswrapper[4759]: I1205 00:44:28.585105 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rdjwr"] Dec 05 00:44:28 crc kubenswrapper[4759]: I1205 00:44:28.976039 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" podUID="f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" containerName="dnsmasq-dns" 
containerID="cri-o://c9a20c795612542b760f8389c459ce088fe2a93b4d9effd123065cf767530412" gracePeriod=10 Dec 05 00:44:29 crc kubenswrapper[4759]: I1205 00:44:29.987005 4759 generic.go:334] "Generic (PLEG): container finished" podID="f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" containerID="c9a20c795612542b760f8389c459ce088fe2a93b4d9effd123065cf767530412" exitCode=0 Dec 05 00:44:29 crc kubenswrapper[4759]: I1205 00:44:29.987079 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" event={"ID":"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9","Type":"ContainerDied","Data":"c9a20c795612542b760f8389c459ce088fe2a93b4d9effd123065cf767530412"} Dec 05 00:44:30 crc kubenswrapper[4759]: I1205 00:44:30.759981 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:30 crc kubenswrapper[4759]: I1205 00:44:30.760822 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:30 crc kubenswrapper[4759]: I1205 00:44:30.765200 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:31 crc kubenswrapper[4759]: I1205 00:44:31.001851 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a28d6f96-86fb-420c-a292-8c65e0088079","Type":"ContainerStarted","Data":"84839738c42d330e0169008a4b8f4db8d356e093fcdb73954b3458ae4e428661"} Dec 05 00:44:31 crc kubenswrapper[4759]: I1205 00:44:31.006381 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-dc4d54c97-74r7x" Dec 05 00:44:31 crc kubenswrapper[4759]: I1205 00:44:31.085654 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7fb5bf68f5-jznkg"] Dec 05 00:44:33 crc kubenswrapper[4759]: I1205 00:44:33.136940 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" podUID="f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: connect: connection refused" Dec 05 00:44:33 crc kubenswrapper[4759]: E1205 00:44:33.785213 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 05 00:44:33 crc kubenswrapper[4759]: E1205 00:44:33.785522 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bhgqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(1013063a-9cdb-47ba-8c7d-5161bbbad9d4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:44:33 crc kubenswrapper[4759]: E1205 00:44:33.787636 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="1013063a-9cdb-47ba-8c7d-5161bbbad9d4" Dec 05 00:44:34 crc kubenswrapper[4759]: E1205 00:44:34.046931 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="1013063a-9cdb-47ba-8c7d-5161bbbad9d4" Dec 05 00:44:34 crc kubenswrapper[4759]: I1205 00:44:34.433274 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:44:34 crc kubenswrapper[4759]: I1205 00:44:34.433368 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.487671 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6qct9"] Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.489236 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.491757 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.496965 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0a66884c-5b7a-4462-8e9d-668a97883211-ovn-rundir\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.497039 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a66884c-5b7a-4462-8e9d-668a97883211-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.497096 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a66884c-5b7a-4462-8e9d-668a97883211-config\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.497127 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a66884c-5b7a-4462-8e9d-668a97883211-combined-ca-bundle\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.497157 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khxfb\" (UniqueName: \"kubernetes.io/projected/0a66884c-5b7a-4462-8e9d-668a97883211-kube-api-access-khxfb\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.497192 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0a66884c-5b7a-4462-8e9d-668a97883211-ovs-rundir\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.501292 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6qct9"] Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.607093 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0a66884c-5b7a-4462-8e9d-668a97883211-ovn-rundir\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " 
pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.607443 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a66884c-5b7a-4462-8e9d-668a97883211-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.607482 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a66884c-5b7a-4462-8e9d-668a97883211-config\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.607508 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a66884c-5b7a-4462-8e9d-668a97883211-combined-ca-bundle\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.607536 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khxfb\" (UniqueName: \"kubernetes.io/projected/0a66884c-5b7a-4462-8e9d-668a97883211-kube-api-access-khxfb\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.607567 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0a66884c-5b7a-4462-8e9d-668a97883211-ovs-rundir\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.607885 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0a66884c-5b7a-4462-8e9d-668a97883211-ovs-rundir\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.607935 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0a66884c-5b7a-4462-8e9d-668a97883211-ovn-rundir\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.611828 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a66884c-5b7a-4462-8e9d-668a97883211-config\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.624201 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a66884c-5b7a-4462-8e9d-668a97883211-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 
00:44:35.630344 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khxfb\" (UniqueName: \"kubernetes.io/projected/0a66884c-5b7a-4462-8e9d-668a97883211-kube-api-access-khxfb\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.640887 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a66884c-5b7a-4462-8e9d-668a97883211-combined-ca-bundle\") pod \"ovn-controller-metrics-6qct9\" (UID: \"0a66884c-5b7a-4462-8e9d-668a97883211\") " pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.686702 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-tmr94"] Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.693043 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.722608 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.759814 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-tmr94"] Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.821423 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6qct9" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.825419 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.825542 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-config\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.825722 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9h9b\" (UniqueName: \"kubernetes.io/projected/69c2e03b-eaa7-45b8-9856-1f53820fb137-kube-api-access-l9h9b\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.825814 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.842612 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-tmr94"] Dec 05 00:44:35 crc kubenswrapper[4759]: E1205 00:44:35.849412 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-l9h9b 
ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" podUID="69c2e03b-eaa7-45b8-9856-1f53820fb137" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.900900 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-qn796"] Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.902672 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.905463 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.912924 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-qn796"] Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.928152 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-config\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.928212 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9h9b\" (UniqueName: \"kubernetes.io/projected/69c2e03b-eaa7-45b8-9856-1f53820fb137-kube-api-access-l9h9b\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.928256 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.928626 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.929252 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.929334 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-config\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 00:44:35.929417 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:35 crc kubenswrapper[4759]: I1205 
00:44:35.951354 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9h9b\" (UniqueName: \"kubernetes.io/projected/69c2e03b-eaa7-45b8-9856-1f53820fb137-kube-api-access-l9h9b\") pod \"dnsmasq-dns-5bf47b49b7-tmr94\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.030545 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-config\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.030607 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.030756 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-dns-svc\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.030809 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.030846 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2mh4\" (UniqueName: \"kubernetes.io/projected/8a5ca45a-c445-4e90-a581-f16d0c3654fd-kube-api-access-b2mh4\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.059365 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.072729 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.132505 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-dns-svc\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.132570 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.132600 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2mh4\" (UniqueName: \"kubernetes.io/projected/8a5ca45a-c445-4e90-a581-f16d0c3654fd-kube-api-access-b2mh4\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.132643 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-config\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.132668 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.133421 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-dns-svc\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.133482 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.133637 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.134409 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-config\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 
00:44:36.152538 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2mh4\" (UniqueName: \"kubernetes.io/projected/8a5ca45a-c445-4e90-a581-f16d0c3654fd-kube-api-access-b2mh4\") pod \"dnsmasq-dns-8554648995-qn796\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.223372 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.234558 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-dns-svc\") pod \"69c2e03b-eaa7-45b8-9856-1f53820fb137\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.234623 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-config\") pod \"69c2e03b-eaa7-45b8-9856-1f53820fb137\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.234749 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9h9b\" (UniqueName: \"kubernetes.io/projected/69c2e03b-eaa7-45b8-9856-1f53820fb137-kube-api-access-l9h9b\") pod \"69c2e03b-eaa7-45b8-9856-1f53820fb137\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.234849 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-ovsdbserver-nb\") pod \"69c2e03b-eaa7-45b8-9856-1f53820fb137\" (UID: \"69c2e03b-eaa7-45b8-9856-1f53820fb137\") " Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.235069 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69c2e03b-eaa7-45b8-9856-1f53820fb137" (UID: "69c2e03b-eaa7-45b8-9856-1f53820fb137"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.235142 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-config" (OuterVolumeSpecName: "config") pod "69c2e03b-eaa7-45b8-9856-1f53820fb137" (UID: "69c2e03b-eaa7-45b8-9856-1f53820fb137"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.235352 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69c2e03b-eaa7-45b8-9856-1f53820fb137" (UID: "69c2e03b-eaa7-45b8-9856-1f53820fb137"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.235380 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.235398 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.238261 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c2e03b-eaa7-45b8-9856-1f53820fb137-kube-api-access-l9h9b" (OuterVolumeSpecName: "kube-api-access-l9h9b") pod "69c2e03b-eaa7-45b8-9856-1f53820fb137" (UID: "69c2e03b-eaa7-45b8-9856-1f53820fb137"). InnerVolumeSpecName "kube-api-access-l9h9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.337249 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9h9b\" (UniqueName: \"kubernetes.io/projected/69c2e03b-eaa7-45b8-9856-1f53820fb137-kube-api-access-l9h9b\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:36 crc kubenswrapper[4759]: I1205 00:44:36.337278 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69c2e03b-eaa7-45b8-9856-1f53820fb137-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:36 crc kubenswrapper[4759]: E1205 00:44:36.671746 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 05 00:44:36 crc kubenswrapper[4759]: E1205 00:44:36.671934 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-75dnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(cb3095df-1b95-485e-99b5-6a3886c58ac3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:44:36 crc kubenswrapper[4759]: E1205 00:44:36.673211 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="cb3095df-1b95-485e-99b5-6a3886c58ac3" Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.067932 4759 util.go:30] "No sandbox for pod can be found. 
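Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94"

The mysql-bootstrap failure above is logged first as ErrImagePull; in the entry that follows, the same init container has moved to ImagePullBackOff, so the kubelet will now wait out an exponentially growing delay before retrying the pull. A minimal Go sketch of that doubling-with-cap schedule, assuming the commonly cited kubelet defaults of a 10s initial delay and a 5m ceiling (the constants here are assumptions, not values read from this node's configuration):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Doubling backoff with a ceiling, as applied between image pull
        // retries while a container reports ImagePullBackOff. The delay
        // resets once a pull finally succeeds.
        delay := 10 * time.Second   // assumed initial backoff
        maxDelay := 5 * time.Minute // assumed cap
        for attempt := 1; attempt <= 7; attempt++ {
            fmt.Printf("attempt %d: wait %v before retrying pull\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
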
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-tmr94" Dec 05 00:44:37 crc kubenswrapper[4759]: E1205 00:44:37.070286 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="cb3095df-1b95-485e-99b5-6a3886c58ac3" Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.137426 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-tmr94"] Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.150353 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-tmr94"] Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.170127 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c2e03b-eaa7-45b8-9856-1f53820fb137" path="/var/lib/kubelet/pods/69c2e03b-eaa7-45b8-9856-1f53820fb137/volumes" Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.327227 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.459334 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-config\") pod \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.459395 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfbzl\" (UniqueName: \"kubernetes.io/projected/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-kube-api-access-kfbzl\") pod \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.459447 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-dns-svc\") pod \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\" (UID: \"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9\") " Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.466889 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-kube-api-access-kfbzl" (OuterVolumeSpecName: "kube-api-access-kfbzl") pod "f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" (UID: "f352d2ed-1545-4f6d-8f69-e30a67b4d3c9"). InnerVolumeSpecName "kube-api-access-kfbzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.500185 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-config" (OuterVolumeSpecName: "config") pod "f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" (UID: "f352d2ed-1545-4f6d-8f69-e30a67b4d3c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.503082 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" (UID: "f352d2ed-1545-4f6d-8f69-e30a67b4d3c9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.561624 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.561658 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfbzl\" (UniqueName: \"kubernetes.io/projected/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-kube-api-access-kfbzl\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:37 crc kubenswrapper[4759]: I1205 00:44:37.561670 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:38 crc kubenswrapper[4759]: I1205 00:44:38.112821 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" event={"ID":"f352d2ed-1545-4f6d-8f69-e30a67b4d3c9","Type":"ContainerDied","Data":"f97c75db4ec1e275365a3cf4ee08985fc4ea7fb149a8801cc957259f75496a63"} Dec 05 00:44:38 crc kubenswrapper[4759]: I1205 00:44:38.113283 4759 scope.go:117] "RemoveContainer" containerID="c9a20c795612542b760f8389c459ce088fe2a93b4d9effd123065cf767530412" Dec 05 00:44:38 crc kubenswrapper[4759]: I1205 00:44:38.113556 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rdjwr" Dec 05 00:44:38 crc kubenswrapper[4759]: E1205 00:44:38.124593 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 05 00:44:38 crc kubenswrapper[4759]: E1205 00:44:38.124634 4759 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 05 00:44:38 crc kubenswrapper[4759]: E1205 00:44:38.124767 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h8mn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(9fdfe467-6f94-4139-b67a-73d6b69e7753): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 00:44:38 crc kubenswrapper[4759]: E1205 00:44:38.126071 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="9fdfe467-6f94-4139-b67a-73d6b69e7753" Dec 05 00:44:38 crc kubenswrapper[4759]: I1205 00:44:38.191097 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rdjwr"] Dec 05 00:44:38 crc kubenswrapper[4759]: I1205 00:44:38.210783 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rdjwr"] Dec 05 00:44:38 crc kubenswrapper[4759]: I1205 00:44:38.218475 4759 scope.go:117] "RemoveContainer" containerID="410d821248d3bd1d09a5849e79ac5fffc12bff5d8398579d31f87ae21711db38" Dec 05 00:44:38 crc kubenswrapper[4759]: I1205 00:44:38.642246 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6qct9"] Dec 05 00:44:38 crc kubenswrapper[4759]: I1205 00:44:38.726028 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-qn796"] Dec 05 00:44:38 crc kubenswrapper[4759]: W1205 00:44:38.735122 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a5ca45a_c445_4e90_a581_f16d0c3654fd.slice/crio-3dac38c83da8fcd92e2fcb977cf600606e8990a8e659bf6c774f0b2170ef2e28 WatchSource:0}: Error finding container 3dac38c83da8fcd92e2fcb977cf600606e8990a8e659bf6c774f0b2170ef2e28: Status 404 returned error can't find the container with id 3dac38c83da8fcd92e2fcb977cf600606e8990a8e659bf6c774f0b2170ef2e28 Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.120853 4759 generic.go:334] "Generic (PLEG): container finished" podID="8a5ca45a-c445-4e90-a581-f16d0c3654fd" containerID="9b7c0a04fa2c95300acbe598247bef174076297f4474c50acbc22ec8002beafd" exitCode=0 Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.120914 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-qn796" event={"ID":"8a5ca45a-c445-4e90-a581-f16d0c3654fd","Type":"ContainerDied","Data":"9b7c0a04fa2c95300acbe598247bef174076297f4474c50acbc22ec8002beafd"} Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.120964 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
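pod="openstack/dnsmasq-dns-8554648995-qn796" event={"ID":"8a5ca45a-c445-4e90-a581-f16d0c3654fd","Type":"ContainerStarted","Data":"3dac38c83da8fcd92e2fcb977cf600606e8990a8e659bf6c774f0b2170ef2e28"}

The flattened container spec in the "Unhandled Error" entry above carries the kube-state-metrics probe configuration: an HTTP liveness check on /livez port 8080 and a readiness check on /readyz port 8081, each with InitialDelaySeconds:5, TimeoutSeconds:5, PeriodSeconds:10, SuccessThreshold:1, FailureThreshold:3. Rebuilt as client-go types purely for readability (a sketch of what the logged Probe literals encode, not the operator's source):

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    // httpProbe mirrors the probe shape dumped in the logged container spec.
    func httpProbe(path string, port int) *corev1.Probe {
        return &corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Path:   path,
                    Port:   intstr.FromInt(port),
                    Scheme: corev1.URISchemeHTTP,
                },
            },
            InitialDelaySeconds: 5,
            TimeoutSeconds:      5,
            PeriodSeconds:       10,
            SuccessThreshold:    1,
            FailureThreshold:    3,
        }
    }

    func main() {
        liveness := httpProbe("/livez", 8080)   // LivenessProbe in the logged spec
        readiness := httpProbe("/readyz", 8081) // ReadinessProbe in the logged spec
        fmt.Println(liveness.HTTPGet.Path, readiness.HTTPGet.Path)
    }
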
pod="openstack/dnsmasq-dns-8554648995-qn796" event={"ID":"8a5ca45a-c445-4e90-a581-f16d0c3654fd","Type":"ContainerStarted","Data":"3dac38c83da8fcd92e2fcb977cf600606e8990a8e659bf6c774f0b2170ef2e28"} Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.123515 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nscp4" event={"ID":"87eff283-d384-4578-9a23-0d7dab551aab","Type":"ContainerStarted","Data":"ba20f801692dd6f8464fe4c1409407f051be8e9dafe80c1b2550247933e608d7"} Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.125131 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"42baeb94-be38-4927-bb0d-9b37877cf412","Type":"ContainerStarted","Data":"3f988a48c74068d881fdfdd2e7026c72e88df2d050e7c1cb35da8bd5cf56ab13"} Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.127643 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cf79c940-d58e-4319-94e8-6bacc34b1ae5","Type":"ContainerStarted","Data":"7cf986c12cb06eaa386ff5ff5a127f534fb66c7d7af7a23bf085b7f362ebe1f8"} Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.127780 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.129964 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69" event={"ID":"3555f68d-fb68-4cf9-91e0-51cc25d2305c","Type":"ContainerStarted","Data":"91864a691f79c9b1e43743e9996d2c29b6e033422f23b31b50889cc12aec9248"} Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.132551 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw9xk" event={"ID":"d4b47f07-88f8-4a9a-97ee-7c61be8a6235","Type":"ContainerStarted","Data":"7f5121a195fadec4120e3a3729262e71d7b34a1ab3bcacc1da68beb2f203c13d"} Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.132762 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nw9xk" Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.134154 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6qct9" event={"ID":"0a66884c-5b7a-4462-8e9d-668a97883211","Type":"ContainerStarted","Data":"af222f8f4bbcde8cc4e2d6cd3f33592899e11b7fe59000e1dcabaf70636027e7"} Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.139815 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a28d6f96-86fb-420c-a292-8c65e0088079","Type":"ContainerStarted","Data":"d9c90e9d375c1472714aa0aa3b0fce3c1c47f68b8cd8ceda03ccc1c6be4112f7"} Dec 05 00:44:39 crc kubenswrapper[4759]: E1205 00:44:39.143438 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="9fdfe467-6f94-4139-b67a-73d6b69e7753" Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.179256 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" path="/var/lib/kubelet/pods/f352d2ed-1545-4f6d-8f69-e30a67b4d3c9/volumes" Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.222503 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-q8g69" 
podStartSLOduration=14.220086502 podStartE2EDuration="29.222472848s" podCreationTimestamp="2025-12-05 00:44:10 +0000 UTC" firstStartedPulling="2025-12-05 00:44:22.105389041 +0000 UTC m=+1281.321049991" lastFinishedPulling="2025-12-05 00:44:37.107775387 +0000 UTC m=+1296.323436337" observedRunningTime="2025-12-05 00:44:39.211610661 +0000 UTC m=+1298.427271631" watchObservedRunningTime="2025-12-05 00:44:39.222472848 +0000 UTC m=+1298.438133788" Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.302191 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nw9xk" podStartSLOduration=11.205806474 podStartE2EDuration="26.302170025s" podCreationTimestamp="2025-12-05 00:44:13 +0000 UTC" firstStartedPulling="2025-12-05 00:44:22.143013977 +0000 UTC m=+1281.358674927" lastFinishedPulling="2025-12-05 00:44:37.239377528 +0000 UTC m=+1296.455038478" observedRunningTime="2025-12-05 00:44:39.234966585 +0000 UTC m=+1298.450627555" watchObservedRunningTime="2025-12-05 00:44:39.302170025 +0000 UTC m=+1298.517830975" Dec 05 00:44:39 crc kubenswrapper[4759]: I1205 00:44:39.313468 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.205739901 podStartE2EDuration="32.313456292s" podCreationTimestamp="2025-12-05 00:44:07 +0000 UTC" firstStartedPulling="2025-12-05 00:44:22.131188406 +0000 UTC m=+1281.346849356" lastFinishedPulling="2025-12-05 00:44:37.238904797 +0000 UTC m=+1296.454565747" observedRunningTime="2025-12-05 00:44:39.284556852 +0000 UTC m=+1298.500217792" watchObservedRunningTime="2025-12-05 00:44:39.313456292 +0000 UTC m=+1298.529117242" Dec 05 00:44:40 crc kubenswrapper[4759]: I1205 00:44:40.164725 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a699edc7-e60f-482d-962d-6c69d625a1c5","Type":"ContainerStarted","Data":"60e223bb18ce845cab8ddf921004958f044d69a60364af0d0f26073750759ea4"} Dec 05 00:44:40 crc kubenswrapper[4759]: I1205 00:44:40.167717 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f171e66-8683-4856-8dd3-690bbdd0f6e5","Type":"ContainerStarted","Data":"0a18484c4c14989b6cd1995e3595bcffb8b46d6d995e6356081f751cb7815390"} Dec 05 00:44:40 crc kubenswrapper[4759]: I1205 00:44:40.170410 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-qn796" event={"ID":"8a5ca45a-c445-4e90-a581-f16d0c3654fd","Type":"ContainerStarted","Data":"71de650ddeb45c3580ef7d26d1b67cdd7f6f9cbf5f77ad1613800b9402ad91d5"} Dec 05 00:44:40 crc kubenswrapper[4759]: I1205 00:44:40.170583 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:40 crc kubenswrapper[4759]: I1205 00:44:40.174293 4759 generic.go:334] "Generic (PLEG): container finished" podID="87eff283-d384-4578-9a23-0d7dab551aab" containerID="ba20f801692dd6f8464fe4c1409407f051be8e9dafe80c1b2550247933e608d7" exitCode=0 Dec 05 00:44:40 crc kubenswrapper[4759]: I1205 00:44:40.174348 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nscp4" event={"ID":"87eff283-d384-4578-9a23-0d7dab551aab","Type":"ContainerDied","Data":"ba20f801692dd6f8464fe4c1409407f051be8e9dafe80c1b2550247933e608d7"} Dec 05 00:44:40 crc kubenswrapper[4759]: I1205 00:44:40.206593 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-qn796" 
podStartSLOduration=5.206572729 podStartE2EDuration="5.206572729s" podCreationTimestamp="2025-12-05 00:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:44:40.206125597 +0000 UTC m=+1299.421786547" watchObservedRunningTime="2025-12-05 00:44:40.206572729 +0000 UTC m=+1299.422233679" Dec 05 00:44:41 crc kubenswrapper[4759]: I1205 00:44:41.191056 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4700495a-dd6c-4b5d-a290-55ab3907a2f5","Type":"ContainerStarted","Data":"beaf554dd9ed722fdef00beb81326a1f73be3c578c893d0dc8a1084ae8d3dcd0"} Dec 05 00:44:42 crc kubenswrapper[4759]: I1205 00:44:42.201014 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nscp4" event={"ID":"87eff283-d384-4578-9a23-0d7dab551aab","Type":"ContainerStarted","Data":"5ab0ffb0d36ba9bcf50e6f15eea9cde37db5ec4b91c921efc718ceb7775ea005"} Dec 05 00:44:43 crc kubenswrapper[4759]: I1205 00:44:43.215005 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6qct9" event={"ID":"0a66884c-5b7a-4462-8e9d-668a97883211","Type":"ContainerStarted","Data":"03d037eaee72f5f60c64da776e236fe9b4cc5ef453efffd09eb93ab54e62b8ff"} Dec 05 00:44:43 crc kubenswrapper[4759]: I1205 00:44:43.220528 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a28d6f96-86fb-420c-a292-8c65e0088079","Type":"ContainerStarted","Data":"ee556b0c7a84db8c5eed317e8a65bf335b92669c9f85d07e8cc00b5cbb77ae7c"} Dec 05 00:44:43 crc kubenswrapper[4759]: I1205 00:44:43.229691 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nscp4" event={"ID":"87eff283-d384-4578-9a23-0d7dab551aab","Type":"ContainerStarted","Data":"923ac1f00670b4749ba9c307ab033283e1d0807b9968583b911adc2204f5cdf8"} Dec 05 00:44:43 crc kubenswrapper[4759]: I1205 00:44:43.230848 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:43 crc kubenswrapper[4759]: I1205 00:44:43.231011 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:44:43 crc kubenswrapper[4759]: I1205 00:44:43.235292 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"42baeb94-be38-4927-bb0d-9b37877cf412","Type":"ContainerStarted","Data":"a2830f03532578c043182c95090ff05b3b9f2ca31356d400f9d3697277c2d80c"} Dec 05 00:44:43 crc kubenswrapper[4759]: I1205 00:44:43.254042 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6qct9" podStartSLOduration=4.924600107 podStartE2EDuration="8.254019537s" podCreationTimestamp="2025-12-05 00:44:35 +0000 UTC" firstStartedPulling="2025-12-05 00:44:38.649490561 +0000 UTC m=+1297.865151511" lastFinishedPulling="2025-12-05 00:44:41.978909991 +0000 UTC m=+1301.194570941" observedRunningTime="2025-12-05 00:44:43.250821698 +0000 UTC m=+1302.466482668" watchObservedRunningTime="2025-12-05 00:44:43.254019537 +0000 UTC m=+1302.469680507" Dec 05 00:44:43 crc kubenswrapper[4759]: I1205 00:44:43.323929 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.979119507 podStartE2EDuration="28.323904913s" podCreationTimestamp="2025-12-05 00:44:15 +0000 UTC" firstStartedPulling="2025-12-05 
00:44:30.63064936 +0000 UTC m=+1289.846310310" lastFinishedPulling="2025-12-05 00:44:41.975434766 +0000 UTC m=+1301.191095716" observedRunningTime="2025-12-05 00:44:43.280049166 +0000 UTC m=+1302.495710116" watchObservedRunningTime="2025-12-05 00:44:43.323904913 +0000 UTC m=+1302.539565863" Dec 05 00:44:43 crc kubenswrapper[4759]: I1205 00:44:43.345753 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-nscp4" podStartSLOduration=15.821459796 podStartE2EDuration="30.345727839s" podCreationTimestamp="2025-12-05 00:44:13 +0000 UTC" firstStartedPulling="2025-12-05 00:44:22.715706041 +0000 UTC m=+1281.931366981" lastFinishedPulling="2025-12-05 00:44:37.239974064 +0000 UTC m=+1296.455635024" observedRunningTime="2025-12-05 00:44:43.318548271 +0000 UTC m=+1302.534209251" watchObservedRunningTime="2025-12-05 00:44:43.345727839 +0000 UTC m=+1302.561388809" Dec 05 00:44:43 crc kubenswrapper[4759]: I1205 00:44:43.352087 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.496075034 podStartE2EDuration="30.352061244s" podCreationTimestamp="2025-12-05 00:44:13 +0000 UTC" firstStartedPulling="2025-12-05 00:44:23.136797032 +0000 UTC m=+1282.352457982" lastFinishedPulling="2025-12-05 00:44:41.992783242 +0000 UTC m=+1301.208444192" observedRunningTime="2025-12-05 00:44:43.344209551 +0000 UTC m=+1302.559870501" watchObservedRunningTime="2025-12-05 00:44:43.352061244 +0000 UTC m=+1302.567722214" Dec 05 00:44:43 crc kubenswrapper[4759]: I1205 00:44:43.922892 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:44 crc kubenswrapper[4759]: I1205 00:44:44.001018 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:44 crc kubenswrapper[4759]: I1205 00:44:44.250686 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:44 crc kubenswrapper[4759]: I1205 00:44:44.317497 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.000700 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.000787 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.068714 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.346335 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.546496 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 00:44:45 crc kubenswrapper[4759]: E1205 00:44:45.547018 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" containerName="init" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.547057 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" containerName="init" Dec 05 00:44:45 crc kubenswrapper[4759]: E1205 00:44:45.547108 4759 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" containerName="dnsmasq-dns" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.547123 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" containerName="dnsmasq-dns" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.550896 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f352d2ed-1545-4f6d-8f69-e30a67b4d3c9" containerName="dnsmasq-dns" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.552620 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.559707 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.559707 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.559803 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.559841 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xxfd9" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.578471 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.675423 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e813fe2b-c789-4c1a-89be-65e269dd6d17-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.675565 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e813fe2b-c789-4c1a-89be-65e269dd6d17-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.675629 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e813fe2b-c789-4c1a-89be-65e269dd6d17-scripts\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.675667 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e813fe2b-c789-4c1a-89be-65e269dd6d17-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.675810 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e813fe2b-c789-4c1a-89be-65e269dd6d17-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.675894 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e813fe2b-c789-4c1a-89be-65e269dd6d17-config\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.675915 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5phv\" (UniqueName: \"kubernetes.io/projected/e813fe2b-c789-4c1a-89be-65e269dd6d17-kube-api-access-n5phv\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.778070 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e813fe2b-c789-4c1a-89be-65e269dd6d17-scripts\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.778446 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e813fe2b-c789-4c1a-89be-65e269dd6d17-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.778522 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e813fe2b-c789-4c1a-89be-65e269dd6d17-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.778564 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5phv\" (UniqueName: \"kubernetes.io/projected/e813fe2b-c789-4c1a-89be-65e269dd6d17-kube-api-access-n5phv\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.778588 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e813fe2b-c789-4c1a-89be-65e269dd6d17-config\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.778713 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e813fe2b-c789-4c1a-89be-65e269dd6d17-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.778754 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e813fe2b-c789-4c1a-89be-65e269dd6d17-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.778923 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e813fe2b-c789-4c1a-89be-65e269dd6d17-scripts\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.779228 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/e813fe2b-c789-4c1a-89be-65e269dd6d17-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.779463 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e813fe2b-c789-4c1a-89be-65e269dd6d17-config\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.784126 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e813fe2b-c789-4c1a-89be-65e269dd6d17-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.784295 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e813fe2b-c789-4c1a-89be-65e269dd6d17-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.786088 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e813fe2b-c789-4c1a-89be-65e269dd6d17-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.824027 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5phv\" (UniqueName: \"kubernetes.io/projected/e813fe2b-c789-4c1a-89be-65e269dd6d17-kube-api-access-n5phv\") pod \"ovn-northd-0\" (UID: \"e813fe2b-c789-4c1a-89be-65e269dd6d17\") " pod="openstack/ovn-northd-0" Dec 05 00:44:45 crc kubenswrapper[4759]: I1205 00:44:45.887336 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.225434 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.275300 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g2xqq"] Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.275569 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" podUID="3be9cea9-8244-4151-8379-0195909a1399" containerName="dnsmasq-dns" containerID="cri-o://91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d" gracePeriod=10 Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.373834 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.718002 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.796427 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj776\" (UniqueName: \"kubernetes.io/projected/3be9cea9-8244-4151-8379-0195909a1399-kube-api-access-wj776\") pod \"3be9cea9-8244-4151-8379-0195909a1399\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.796506 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-config\") pod \"3be9cea9-8244-4151-8379-0195909a1399\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.796594 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-dns-svc\") pod \"3be9cea9-8244-4151-8379-0195909a1399\" (UID: \"3be9cea9-8244-4151-8379-0195909a1399\") " Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.801268 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be9cea9-8244-4151-8379-0195909a1399-kube-api-access-wj776" (OuterVolumeSpecName: "kube-api-access-wj776") pod "3be9cea9-8244-4151-8379-0195909a1399" (UID: "3be9cea9-8244-4151-8379-0195909a1399"). InnerVolumeSpecName "kube-api-access-wj776". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.834348 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-config" (OuterVolumeSpecName: "config") pod "3be9cea9-8244-4151-8379-0195909a1399" (UID: "3be9cea9-8244-4151-8379-0195909a1399"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.852046 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3be9cea9-8244-4151-8379-0195909a1399" (UID: "3be9cea9-8244-4151-8379-0195909a1399"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.900245 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj776\" (UniqueName: \"kubernetes.io/projected/3be9cea9-8244-4151-8379-0195909a1399-kube-api-access-wj776\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.900283 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:46 crc kubenswrapper[4759]: I1205 00:44:46.900297 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3be9cea9-8244-4151-8379-0195909a1399-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.287263 4759 generic.go:334] "Generic (PLEG): container finished" podID="3be9cea9-8244-4151-8379-0195909a1399" containerID="91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d" exitCode=0 Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.287352 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" event={"ID":"3be9cea9-8244-4151-8379-0195909a1399","Type":"ContainerDied","Data":"91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d"} Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.287373 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.287410 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-g2xqq" event={"ID":"3be9cea9-8244-4151-8379-0195909a1399","Type":"ContainerDied","Data":"1589121b1f31dc45833b965c7ecd14c7fb44883baed69fcd7c2051a1c7f0733e"} Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.287431 4759 scope.go:117] "RemoveContainer" containerID="91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d" Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.291880 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e813fe2b-c789-4c1a-89be-65e269dd6d17","Type":"ContainerStarted","Data":"04e7e802a49538b107e9a1c4d699ed49535dbfbda01f14caeaa3ebda7310e932"} Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.314965 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g2xqq"] Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.327076 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g2xqq"] Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.445668 4759 scope.go:117] "RemoveContainer" containerID="7138cd41fd191aa61271fda180b6a63fc77b0a05e8601f1d506ae4bedb21c54b" Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.502121 4759 scope.go:117] "RemoveContainer" containerID="91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d" Dec 05 00:44:47 crc kubenswrapper[4759]: E1205 00:44:47.502694 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d\": container with ID starting with 91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d not found: ID does not exist" containerID="91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d" Dec 05 
00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.502744 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d"} err="failed to get container status \"91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d\": rpc error: code = NotFound desc = could not find container \"91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d\": container with ID starting with 91baf320cfcc9e8ee6fd6fbb9fbb3abd4d9f23108eebaf4cedfc7fd2e9c0b93d not found: ID does not exist" Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.502782 4759 scope.go:117] "RemoveContainer" containerID="7138cd41fd191aa61271fda180b6a63fc77b0a05e8601f1d506ae4bedb21c54b" Dec 05 00:44:47 crc kubenswrapper[4759]: E1205 00:44:47.503155 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7138cd41fd191aa61271fda180b6a63fc77b0a05e8601f1d506ae4bedb21c54b\": container with ID starting with 7138cd41fd191aa61271fda180b6a63fc77b0a05e8601f1d506ae4bedb21c54b not found: ID does not exist" containerID="7138cd41fd191aa61271fda180b6a63fc77b0a05e8601f1d506ae4bedb21c54b" Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.503217 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7138cd41fd191aa61271fda180b6a63fc77b0a05e8601f1d506ae4bedb21c54b"} err="failed to get container status \"7138cd41fd191aa61271fda180b6a63fc77b0a05e8601f1d506ae4bedb21c54b\": rpc error: code = NotFound desc = could not find container \"7138cd41fd191aa61271fda180b6a63fc77b0a05e8601f1d506ae4bedb21c54b\": container with ID starting with 7138cd41fd191aa61271fda180b6a63fc77b0a05e8601f1d506ae4bedb21c54b not found: ID does not exist" Dec 05 00:44:47 crc kubenswrapper[4759]: I1205 00:44:47.994428 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 05 00:44:48 crc kubenswrapper[4759]: I1205 00:44:48.308427 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e813fe2b-c789-4c1a-89be-65e269dd6d17","Type":"ContainerStarted","Data":"5338975647e8a050073c73e04f9c61485040e7e3a12d24271c2b423b3711994f"} Dec 05 00:44:48 crc kubenswrapper[4759]: I1205 00:44:48.308947 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 00:44:48 crc kubenswrapper[4759]: I1205 00:44:48.308969 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e813fe2b-c789-4c1a-89be-65e269dd6d17","Type":"ContainerStarted","Data":"4b4796561d9f620bb53c2ee6cdc0f08537bedcb2d0c7d05ba5723065c328a5b4"} Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.167964 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be9cea9-8244-4151-8379-0195909a1399" path="/var/lib/kubelet/pods/3be9cea9-8244-4151-8379-0195909a1399/volumes" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.186601 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.063141256 podStartE2EDuration="4.186576777s" podCreationTimestamp="2025-12-05 00:44:45 +0000 UTC" firstStartedPulling="2025-12-05 00:44:46.379813069 +0000 UTC m=+1305.595474019" lastFinishedPulling="2025-12-05 00:44:47.50324858 +0000 UTC m=+1306.718909540" observedRunningTime="2025-12-05 00:44:48.327793253 +0000 UTC m=+1307.543454203" 
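watchObservedRunningTime="2025-12-05 00:44:49.186576777 +0000 UTC m=+1308.402237727"

The startup-latency entry above reports podStartE2EDuration="4.186576777s" but podStartSLOduration=3.063141256 for ovn-northd-0. The SLO figure is the end-to-end startup time with the image-pull window subtracted, and the monotonic m=+ offsets carried in the same entry let the arithmetic be checked exactly; the sketch below redoes it with the logged values:

    package main

    import "fmt"

    func main() {
        // Monotonic m=+ offsets copied from the ovn-northd-0 entry above.
        firstStartedPulling := 1305.595474019
        lastFinishedPulling := 1306.718909540
        e2e := 4.186576777 // podStartE2EDuration, seconds

        pull := lastFinishedPulling - firstStartedPulling
        fmt.Printf("image-pull window: %.9fs\n", pull)     // 1.123435521s
        fmt.Printf("SLO duration:      %.9fs\n", e2e-pull) // 3.063141256s, matching podStartSLOduration (up to float rounding)
    }
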
watchObservedRunningTime="2025-12-05 00:44:49.186576777 +0000 UTC m=+1308.402237727" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.318348 4759 generic.go:334] "Generic (PLEG): container finished" podID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerID="beaf554dd9ed722fdef00beb81326a1f73be3c578c893d0dc8a1084ae8d3dcd0" exitCode=0 Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.318443 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4700495a-dd6c-4b5d-a290-55ab3907a2f5","Type":"ContainerDied","Data":"beaf554dd9ed722fdef00beb81326a1f73be3c578c893d0dc8a1084ae8d3dcd0"} Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.736497 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hxz4t"] Dec 05 00:44:49 crc kubenswrapper[4759]: E1205 00:44:49.737193 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be9cea9-8244-4151-8379-0195909a1399" containerName="dnsmasq-dns" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.737210 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be9cea9-8244-4151-8379-0195909a1399" containerName="dnsmasq-dns" Dec 05 00:44:49 crc kubenswrapper[4759]: E1205 00:44:49.737221 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be9cea9-8244-4151-8379-0195909a1399" containerName="init" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.737227 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be9cea9-8244-4151-8379-0195909a1399" containerName="init" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.737428 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be9cea9-8244-4151-8379-0195909a1399" containerName="dnsmasq-dns" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.738452 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.763932 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hxz4t"] Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.855217 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.855330 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4st5k\" (UniqueName: \"kubernetes.io/projected/e332e191-2628-48d0-be38-886992343bc8-kube-api-access-4st5k\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.855362 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.855406 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-config\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.855429 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.957334 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4st5k\" (UniqueName: \"kubernetes.io/projected/e332e191-2628-48d0-be38-886992343bc8-kube-api-access-4st5k\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.957399 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.957450 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-config\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.957475 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.957567 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.958534 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.958667 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-config\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.959077 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.963785 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:49 crc kubenswrapper[4759]: I1205 00:44:49.995114 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4st5k\" (UniqueName: \"kubernetes.io/projected/e332e191-2628-48d0-be38-886992343bc8-kube-api-access-4st5k\") pod \"dnsmasq-dns-b8fbc5445-hxz4t\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.060864 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.332241 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1013063a-9cdb-47ba-8c7d-5161bbbad9d4","Type":"ContainerStarted","Data":"9edbc93ff1f632bf32cafec529567dc118ae4c29a7ac128a1935d58d48803ae4"} Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.627015 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hxz4t"] Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.865758 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.881595 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.887260 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-96hbt" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.887497 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.887702 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.887833 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.889792 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.912761 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wt66w"] Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.913945 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.923122 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.923385 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.923503 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.946420 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wt66w"] Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.983330 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/28edaf49-80c4-4732-a19f-1f2348fcd8e7-cache\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.983662 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.983765 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.984060 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8j7q\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-kube-api-access-w8j7q\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:50 crc kubenswrapper[4759]: I1205 00:44:50.984289 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/28edaf49-80c4-4732-a19f-1f2348fcd8e7-lock\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.086743 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-combined-ca-bundle\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.087151 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/12e24711-58db-434b-97ed-5db25d183784-etc-swift\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.087330 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/28edaf49-80c4-4732-a19f-1f2348fcd8e7-cache\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.087552 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.087688 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-dispersionconf\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.087858 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.087973 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8j7q\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-kube-api-access-w8j7q\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.088075 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/28edaf49-80c4-4732-a19f-1f2348fcd8e7-cache\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: E1205 00:44:51.087725 4759 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 00:44:51 crc kubenswrapper[4759]: E1205 00:44:51.088116 4759 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 00:44:51 crc 
kubenswrapper[4759]: I1205 00:44:51.088296 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-scripts\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.088433 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.088562 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzzv\" (UniqueName: \"kubernetes.io/projected/12e24711-58db-434b-97ed-5db25d183784-kube-api-access-lvzzv\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.088741 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-swiftconf\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.088776 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-ring-data-devices\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.088824 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/28edaf49-80c4-4732-a19f-1f2348fcd8e7-lock\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.089170 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/28edaf49-80c4-4732-a19f-1f2348fcd8e7-lock\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: E1205 00:44:51.089234 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift podName:28edaf49-80c4-4732-a19f-1f2348fcd8e7 nodeName:}" failed. No retries permitted until 2025-12-05 00:44:51.589217609 +0000 UTC m=+1310.804878559 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift") pod "swift-storage-0" (UID: "28edaf49-80c4-4732-a19f-1f2348fcd8e7") : configmap "swift-ring-files" not found Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.115653 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8j7q\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-kube-api-access-w8j7q\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.117560 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.190696 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/12e24711-58db-434b-97ed-5db25d183784-etc-swift\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.190760 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-dispersionconf\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.190803 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-scripts\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.190832 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzzv\" (UniqueName: \"kubernetes.io/projected/12e24711-58db-434b-97ed-5db25d183784-kube-api-access-lvzzv\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.190851 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-swiftconf\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.190866 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-ring-data-devices\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.191274 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-combined-ca-bundle\") pod \"swift-ring-rebalance-wt66w\" (UID: 
\"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.191828 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-scripts\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.192109 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/12e24711-58db-434b-97ed-5db25d183784-etc-swift\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.192406 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-ring-data-devices\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.194980 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-dispersionconf\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.197373 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-combined-ca-bundle\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.199748 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-swiftconf\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.210758 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzzv\" (UniqueName: \"kubernetes.io/projected/12e24711-58db-434b-97ed-5db25d183784-kube-api-access-lvzzv\") pod \"swift-ring-rebalance-wt66w\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.319356 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.339878 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cb3095df-1b95-485e-99b5-6a3886c58ac3","Type":"ContainerStarted","Data":"63fe4659e5d7afc4a784447a9073c9df003ad03dfbf367fd5338e46d294c1e44"} Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.341925 4759 generic.go:334] "Generic (PLEG): container finished" podID="e332e191-2628-48d0-be38-886992343bc8" containerID="8c3bda245e3836d5b2462b42288f424a4f366c08d81f617136fd39a29456ee34" exitCode=0 Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.341958 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" event={"ID":"e332e191-2628-48d0-be38-886992343bc8","Type":"ContainerDied","Data":"8c3bda245e3836d5b2462b42288f424a4f366c08d81f617136fd39a29456ee34"} Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.341979 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" event={"ID":"e332e191-2628-48d0-be38-886992343bc8","Type":"ContainerStarted","Data":"d866018c16a540a0fcc280210c22431c2d2f4b0b3a2e221a52cba049254479d5"} Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.601380 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:51 crc kubenswrapper[4759]: E1205 00:44:51.601538 4759 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 00:44:51 crc kubenswrapper[4759]: E1205 00:44:51.601753 4759 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 00:44:51 crc kubenswrapper[4759]: E1205 00:44:51.601802 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift podName:28edaf49-80c4-4732-a19f-1f2348fcd8e7 nodeName:}" failed. No retries permitted until 2025-12-05 00:44:52.601787573 +0000 UTC m=+1311.817448523 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift") pod "swift-storage-0" (UID: "28edaf49-80c4-4732-a19f-1f2348fcd8e7") : configmap "swift-ring-files" not found Dec 05 00:44:51 crc kubenswrapper[4759]: I1205 00:44:51.900847 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wt66w"] Dec 05 00:44:52 crc kubenswrapper[4759]: I1205 00:44:52.355324 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" event={"ID":"e332e191-2628-48d0-be38-886992343bc8","Type":"ContainerStarted","Data":"ef87ecc5174814c63cc0de21f385d1e30f8344eaebfd0ba1ca98d409f0497e20"} Dec 05 00:44:52 crc kubenswrapper[4759]: I1205 00:44:52.355416 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:44:52 crc kubenswrapper[4759]: I1205 00:44:52.357640 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9fdfe467-6f94-4139-b67a-73d6b69e7753","Type":"ContainerStarted","Data":"90b06ff3bb1e6b7b8ead1354dc89e8b047dfe53dd2df7c4e64a075f5542271ad"} Dec 05 00:44:52 crc kubenswrapper[4759]: I1205 00:44:52.358072 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 00:44:52 crc kubenswrapper[4759]: I1205 00:44:52.360086 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wt66w" event={"ID":"12e24711-58db-434b-97ed-5db25d183784","Type":"ContainerStarted","Data":"9980b2010ac1c9b5d11fd662848bb8eb5167cf33c063a5ef9401678af99f519b"} Dec 05 00:44:52 crc kubenswrapper[4759]: I1205 00:44:52.387286 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" podStartSLOduration=3.387270617 podStartE2EDuration="3.387270617s" podCreationTimestamp="2025-12-05 00:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:44:52.378755189 +0000 UTC m=+1311.594416139" watchObservedRunningTime="2025-12-05 00:44:52.387270617 +0000 UTC m=+1311.602931567" Dec 05 00:44:52 crc kubenswrapper[4759]: I1205 00:44:52.398976 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.193739593 podStartE2EDuration="43.398959815s" podCreationTimestamp="2025-12-05 00:44:09 +0000 UTC" firstStartedPulling="2025-12-05 00:44:21.360852155 +0000 UTC m=+1280.576513105" lastFinishedPulling="2025-12-05 00:44:51.566072377 +0000 UTC m=+1310.781733327" observedRunningTime="2025-12-05 00:44:52.391778519 +0000 UTC m=+1311.607439469" watchObservedRunningTime="2025-12-05 00:44:52.398959815 +0000 UTC m=+1311.614620765" Dec 05 00:44:52 crc kubenswrapper[4759]: I1205 00:44:52.646118 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:52 crc kubenswrapper[4759]: E1205 00:44:52.646543 4759 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 00:44:52 crc kubenswrapper[4759]: E1205 00:44:52.646563 4759 projected.go:194] Error preparing data for projected volume etc-swift for 
pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 00:44:52 crc kubenswrapper[4759]: E1205 00:44:52.646617 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift podName:28edaf49-80c4-4732-a19f-1f2348fcd8e7 nodeName:}" failed. No retries permitted until 2025-12-05 00:44:54.646599634 +0000 UTC m=+1313.862260584 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift") pod "swift-storage-0" (UID: "28edaf49-80c4-4732-a19f-1f2348fcd8e7") : configmap "swift-ring-files" not found Dec 05 00:44:54 crc kubenswrapper[4759]: I1205 00:44:54.376589 4759 generic.go:334] "Generic (PLEG): container finished" podID="1013063a-9cdb-47ba-8c7d-5161bbbad9d4" containerID="9edbc93ff1f632bf32cafec529567dc118ae4c29a7ac128a1935d58d48803ae4" exitCode=0 Dec 05 00:44:54 crc kubenswrapper[4759]: I1205 00:44:54.376808 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1013063a-9cdb-47ba-8c7d-5161bbbad9d4","Type":"ContainerDied","Data":"9edbc93ff1f632bf32cafec529567dc118ae4c29a7ac128a1935d58d48803ae4"} Dec 05 00:44:54 crc kubenswrapper[4759]: I1205 00:44:54.378762 4759 generic.go:334] "Generic (PLEG): container finished" podID="cb3095df-1b95-485e-99b5-6a3886c58ac3" containerID="63fe4659e5d7afc4a784447a9073c9df003ad03dfbf367fd5338e46d294c1e44" exitCode=0 Dec 05 00:44:54 crc kubenswrapper[4759]: I1205 00:44:54.378790 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cb3095df-1b95-485e-99b5-6a3886c58ac3","Type":"ContainerDied","Data":"63fe4659e5d7afc4a784447a9073c9df003ad03dfbf367fd5338e46d294c1e44"} Dec 05 00:44:54 crc kubenswrapper[4759]: I1205 00:44:54.685037 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:54 crc kubenswrapper[4759]: E1205 00:44:54.685375 4759 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 00:44:54 crc kubenswrapper[4759]: E1205 00:44:54.685650 4759 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 00:44:54 crc kubenswrapper[4759]: E1205 00:44:54.685715 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift podName:28edaf49-80c4-4732-a19f-1f2348fcd8e7 nodeName:}" failed. No retries permitted until 2025-12-05 00:44:58.685694887 +0000 UTC m=+1317.901355837 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift") pod "swift-storage-0" (UID: "28edaf49-80c4-4732-a19f-1f2348fcd8e7") : configmap "swift-ring-files" not found Dec 05 00:44:56 crc kubenswrapper[4759]: I1205 00:44:56.140815 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7fb5bf68f5-jznkg" podUID="220b0801-31e1-4193-9b01-30a191741f12" containerName="console" containerID="cri-o://3a8656dfb732f941f029996bc734a318c98e85bacf407593445f85cb8b644bf4" gracePeriod=15 Dec 05 00:44:56 crc kubenswrapper[4759]: I1205 00:44:56.408667 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7fb5bf68f5-jznkg_220b0801-31e1-4193-9b01-30a191741f12/console/0.log" Dec 05 00:44:56 crc kubenswrapper[4759]: I1205 00:44:56.408720 4759 generic.go:334] "Generic (PLEG): container finished" podID="220b0801-31e1-4193-9b01-30a191741f12" containerID="3a8656dfb732f941f029996bc734a318c98e85bacf407593445f85cb8b644bf4" exitCode=2 Dec 05 00:44:56 crc kubenswrapper[4759]: I1205 00:44:56.408749 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fb5bf68f5-jznkg" event={"ID":"220b0801-31e1-4193-9b01-30a191741f12","Type":"ContainerDied","Data":"3a8656dfb732f941f029996bc734a318c98e85bacf407593445f85cb8b644bf4"} Dec 05 00:44:58 crc kubenswrapper[4759]: I1205 00:44:58.770276 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:44:58 crc kubenswrapper[4759]: E1205 00:44:58.770640 4759 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 00:44:58 crc kubenswrapper[4759]: E1205 00:44:58.772140 4759 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 00:44:58 crc kubenswrapper[4759]: E1205 00:44:58.772410 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift podName:28edaf49-80c4-4732-a19f-1f2348fcd8e7 nodeName:}" failed. No retries permitted until 2025-12-05 00:45:06.772383239 +0000 UTC m=+1325.988044199 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift") pod "swift-storage-0" (UID: "28edaf49-80c4-4732-a19f-1f2348fcd8e7") : configmap "swift-ring-files" not found Dec 05 00:44:59 crc kubenswrapper[4759]: I1205 00:44:59.334169 4759 patch_prober.go:28] interesting pod/console-7fb5bf68f5-jznkg container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.80:8443/health\": dial tcp 10.217.0.80:8443: connect: connection refused" start-of-body= Dec 05 00:44:59 crc kubenswrapper[4759]: I1205 00:44:59.334225 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-7fb5bf68f5-jznkg" podUID="220b0801-31e1-4193-9b01-30a191741f12" containerName="console" probeResult="failure" output="Get \"https://10.217.0.80:8443/health\": dial tcp 10.217.0.80:8443: connect: connection refused" Dec 05 00:44:59 crc kubenswrapper[4759]: I1205 00:44:59.775157 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.062771 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.123427 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-qn796"] Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.123655 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-qn796" podUID="8a5ca45a-c445-4e90-a581-f16d0c3654fd" containerName="dnsmasq-dns" containerID="cri-o://71de650ddeb45c3580ef7d26d1b67cdd7f6f9cbf5f77ad1613800b9402ad91d5" gracePeriod=10 Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.193595 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl"] Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.227001 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl"] Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.227123 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.230933 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.231347 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.309355 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwqt6\" (UniqueName: \"kubernetes.io/projected/294c868c-f73e-4ef5-b9ac-359751888700-kube-api-access-gwqt6\") pod \"collect-profiles-29414925-qqvbl\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.309430 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/294c868c-f73e-4ef5-b9ac-359751888700-config-volume\") pod \"collect-profiles-29414925-qqvbl\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.309491 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/294c868c-f73e-4ef5-b9ac-359751888700-secret-volume\") pod \"collect-profiles-29414925-qqvbl\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.411433 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwqt6\" (UniqueName: \"kubernetes.io/projected/294c868c-f73e-4ef5-b9ac-359751888700-kube-api-access-gwqt6\") pod \"collect-profiles-29414925-qqvbl\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.411523 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/294c868c-f73e-4ef5-b9ac-359751888700-config-volume\") pod \"collect-profiles-29414925-qqvbl\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.411606 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/294c868c-f73e-4ef5-b9ac-359751888700-secret-volume\") pod \"collect-profiles-29414925-qqvbl\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.413548 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/294c868c-f73e-4ef5-b9ac-359751888700-config-volume\") pod \"collect-profiles-29414925-qqvbl\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:00 crc 
kubenswrapper[4759]: I1205 00:45:00.432203 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/294c868c-f73e-4ef5-b9ac-359751888700-secret-volume\") pod \"collect-profiles-29414925-qqvbl\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.453378 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwqt6\" (UniqueName: \"kubernetes.io/projected/294c868c-f73e-4ef5-b9ac-359751888700-kube-api-access-gwqt6\") pod \"collect-profiles-29414925-qqvbl\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.549491 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:00 crc kubenswrapper[4759]: I1205 00:45:00.952117 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 05 00:45:01 crc kubenswrapper[4759]: I1205 00:45:01.467009 4759 generic.go:334] "Generic (PLEG): container finished" podID="8a5ca45a-c445-4e90-a581-f16d0c3654fd" containerID="71de650ddeb45c3580ef7d26d1b67cdd7f6f9cbf5f77ad1613800b9402ad91d5" exitCode=0 Dec 05 00:45:01 crc kubenswrapper[4759]: I1205 00:45:01.467075 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-qn796" event={"ID":"8a5ca45a-c445-4e90-a581-f16d0c3654fd","Type":"ContainerDied","Data":"71de650ddeb45c3580ef7d26d1b67cdd7f6f9cbf5f77ad1613800b9402ad91d5"} Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.757986 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7fb5bf68f5-jznkg_220b0801-31e1-4193-9b01-30a191741f12/console/0.log" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.758442 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7fb5bf68f5-jznkg" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.768723 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-oauth-config\") pod \"220b0801-31e1-4193-9b01-30a191741f12\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.768810 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-service-ca\") pod \"220b0801-31e1-4193-9b01-30a191741f12\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.768867 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6mh6\" (UniqueName: \"kubernetes.io/projected/220b0801-31e1-4193-9b01-30a191741f12-kube-api-access-c6mh6\") pod \"220b0801-31e1-4193-9b01-30a191741f12\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.768913 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-serving-cert\") pod \"220b0801-31e1-4193-9b01-30a191741f12\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.769018 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-oauth-serving-cert\") pod \"220b0801-31e1-4193-9b01-30a191741f12\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.769061 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-trusted-ca-bundle\") pod \"220b0801-31e1-4193-9b01-30a191741f12\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.769101 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-console-config\") pod \"220b0801-31e1-4193-9b01-30a191741f12\" (UID: \"220b0801-31e1-4193-9b01-30a191741f12\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.770736 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-console-config" (OuterVolumeSpecName: "console-config") pod "220b0801-31e1-4193-9b01-30a191741f12" (UID: "220b0801-31e1-4193-9b01-30a191741f12"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.772080 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "220b0801-31e1-4193-9b01-30a191741f12" (UID: "220b0801-31e1-4193-9b01-30a191741f12"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.772630 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-service-ca" (OuterVolumeSpecName: "service-ca") pod "220b0801-31e1-4193-9b01-30a191741f12" (UID: "220b0801-31e1-4193-9b01-30a191741f12"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.772950 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "220b0801-31e1-4193-9b01-30a191741f12" (UID: "220b0801-31e1-4193-9b01-30a191741f12"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.787109 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "220b0801-31e1-4193-9b01-30a191741f12" (UID: "220b0801-31e1-4193-9b01-30a191741f12"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.787593 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.788549 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/220b0801-31e1-4193-9b01-30a191741f12-kube-api-access-c6mh6" (OuterVolumeSpecName: "kube-api-access-c6mh6") pod "220b0801-31e1-4193-9b01-30a191741f12" (UID: "220b0801-31e1-4193-9b01-30a191741f12"). InnerVolumeSpecName "kube-api-access-c6mh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.790396 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "220b0801-31e1-4193-9b01-30a191741f12" (UID: "220b0801-31e1-4193-9b01-30a191741f12"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.870527 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-nb\") pod \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.870605 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-sb\") pod \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.870714 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-dns-svc\") pod \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.870826 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-config\") pod \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.870878 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2mh4\" (UniqueName: \"kubernetes.io/projected/8a5ca45a-c445-4e90-a581-f16d0c3654fd-kube-api-access-b2mh4\") pod \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\" (UID: \"8a5ca45a-c445-4e90-a581-f16d0c3654fd\") " Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.871344 4759 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.871371 4759 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.871383 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6mh6\" (UniqueName: \"kubernetes.io/projected/220b0801-31e1-4193-9b01-30a191741f12-kube-api-access-c6mh6\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.871394 4759 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/220b0801-31e1-4193-9b01-30a191741f12-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.871402 4759 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.871410 4759 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.871418 4759 reconciler_common.go:293] "Volume 
detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/220b0801-31e1-4193-9b01-30a191741f12-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.877719 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a5ca45a-c445-4e90-a581-f16d0c3654fd-kube-api-access-b2mh4" (OuterVolumeSpecName: "kube-api-access-b2mh4") pod "8a5ca45a-c445-4e90-a581-f16d0c3654fd" (UID: "8a5ca45a-c445-4e90-a581-f16d0c3654fd"). InnerVolumeSpecName "kube-api-access-b2mh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.973337 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2mh4\" (UniqueName: \"kubernetes.io/projected/8a5ca45a-c445-4e90-a581-f16d0c3654fd-kube-api-access-b2mh4\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:02 crc kubenswrapper[4759]: I1205 00:45:02.999013 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8a5ca45a-c445-4e90-a581-f16d0c3654fd" (UID: "8a5ca45a-c445-4e90-a581-f16d0c3654fd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.012903 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8a5ca45a-c445-4e90-a581-f16d0c3654fd" (UID: "8a5ca45a-c445-4e90-a581-f16d0c3654fd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.015770 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8a5ca45a-c445-4e90-a581-f16d0c3654fd" (UID: "8a5ca45a-c445-4e90-a581-f16d0c3654fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.028009 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-config" (OuterVolumeSpecName: "config") pod "8a5ca45a-c445-4e90-a581-f16d0c3654fd" (UID: "8a5ca45a-c445-4e90-a581-f16d0c3654fd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.075206 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.075445 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.075517 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.075580 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a5ca45a-c445-4e90-a581-f16d0c3654fd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.116026 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl"] Dec 05 00:45:03 crc kubenswrapper[4759]: W1205 00:45:03.119258 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod294c868c_f73e_4ef5_b9ac_359751888700.slice/crio-1805b196d9f75deda8cb6f6642b6d00b40ca358398f483dca51d5cddc02d0eee WatchSource:0}: Error finding container 1805b196d9f75deda8cb6f6642b6d00b40ca358398f483dca51d5cddc02d0eee: Status 404 returned error can't find the container with id 1805b196d9f75deda8cb6f6642b6d00b40ca358398f483dca51d5cddc02d0eee Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.491643 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1013063a-9cdb-47ba-8c7d-5161bbbad9d4","Type":"ContainerStarted","Data":"55f50e180356d458e9eadec6ed03c5a8e98026ae03964e6605ccb7d9693e88a4"} Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.494533 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cb3095df-1b95-485e-99b5-6a3886c58ac3","Type":"ContainerStarted","Data":"ecf37324d31a7c4a875c54b0a4cde373e34af59702295bdc0064f12cb402e8b0"} Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.498043 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-qn796" event={"ID":"8a5ca45a-c445-4e90-a581-f16d0c3654fd","Type":"ContainerDied","Data":"3dac38c83da8fcd92e2fcb977cf600606e8990a8e659bf6c774f0b2170ef2e28"} Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.498086 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-qn796" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.498092 4759 scope.go:117] "RemoveContainer" containerID="71de650ddeb45c3580ef7d26d1b67cdd7f6f9cbf5f77ad1613800b9402ad91d5" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.500780 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" event={"ID":"294c868c-f73e-4ef5-b9ac-359751888700","Type":"ContainerStarted","Data":"c8d12ff814c7fe7114b6519df703ddefdb3289797ec33d4d7596113349a4d821"} Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.500849 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" event={"ID":"294c868c-f73e-4ef5-b9ac-359751888700","Type":"ContainerStarted","Data":"1805b196d9f75deda8cb6f6642b6d00b40ca358398f483dca51d5cddc02d0eee"} Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.504625 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wt66w" event={"ID":"12e24711-58db-434b-97ed-5db25d183784","Type":"ContainerStarted","Data":"70238cad41e44f645a6d54ef9a528c262258a99b72c686a4182aa6d188c502b1"} Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.509996 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7fb5bf68f5-jznkg_220b0801-31e1-4193-9b01-30a191741f12/console/0.log" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.510103 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fb5bf68f5-jznkg" event={"ID":"220b0801-31e1-4193-9b01-30a191741f12","Type":"ContainerDied","Data":"0a7998bd64b4a8dccf01ce3b278a4361fdc42d1c917d8a3c88f696f72281cb5e"} Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.510202 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7fb5bf68f5-jznkg" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.526420 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4700495a-dd6c-4b5d-a290-55ab3907a2f5","Type":"ContainerStarted","Data":"19bd0bc5a80f37d207a3d873f7b7c86098312bc33d67181f4d441a6568ba051c"} Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.528605 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=30.31661905 podStartE2EDuration="59.528584399s" podCreationTimestamp="2025-12-05 00:44:04 +0000 UTC" firstStartedPulling="2025-12-05 00:44:20.66467875 +0000 UTC m=+1279.880339710" lastFinishedPulling="2025-12-05 00:44:49.876644109 +0000 UTC m=+1309.092305059" observedRunningTime="2025-12-05 00:45:03.525839712 +0000 UTC m=+1322.741500662" watchObservedRunningTime="2025-12-05 00:45:03.528584399 +0000 UTC m=+1322.744245349" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.543383 4759 scope.go:117] "RemoveContainer" containerID="9b7c0a04fa2c95300acbe598247bef174076297f4474c50acbc22ec8002beafd" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.567520 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-wt66w" podStartSLOduration=2.851368312 podStartE2EDuration="13.567495444s" podCreationTimestamp="2025-12-05 00:44:50 +0000 UTC" firstStartedPulling="2025-12-05 00:44:51.925622634 +0000 UTC m=+1311.141283584" lastFinishedPulling="2025-12-05 00:45:02.641749776 +0000 UTC m=+1321.857410716" observedRunningTime="2025-12-05 00:45:03.55064785 +0000 UTC m=+1322.766308790" watchObservedRunningTime="2025-12-05 00:45:03.567495444 +0000 UTC m=+1322.783156394" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.618393 4759 scope.go:117] "RemoveContainer" containerID="3a8656dfb732f941f029996bc734a318c98e85bacf407593445f85cb8b644bf4" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.626129 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371979.228672 podStartE2EDuration="57.626104363s" podCreationTimestamp="2025-12-05 00:44:06 +0000 UTC" firstStartedPulling="2025-12-05 00:44:22.152940143 +0000 UTC m=+1281.368601093" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:03.610717735 +0000 UTC m=+1322.826378685" watchObservedRunningTime="2025-12-05 00:45:03.626104363 +0000 UTC m=+1322.841765313" Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.660924 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7fb5bf68f5-jznkg"] Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.674541 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7fb5bf68f5-jznkg"] Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.681070 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-qn796"] Dec 05 00:45:03 crc kubenswrapper[4759]: I1205 00:45:03.690533 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-qn796"] Dec 05 00:45:04 crc kubenswrapper[4759]: I1205 00:45:04.433880 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:45:04 crc kubenswrapper[4759]: I1205 00:45:04.434271 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:45:04 crc kubenswrapper[4759]: I1205 00:45:04.434344 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:45:04 crc kubenswrapper[4759]: I1205 00:45:04.435561 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"398c049422bc0a204c0d315a98c453ec61f78c5e1ec3ea26a6b6394b0111caae"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 00:45:04 crc kubenswrapper[4759]: I1205 00:45:04.435672 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://398c049422bc0a204c0d315a98c453ec61f78c5e1ec3ea26a6b6394b0111caae" gracePeriod=600 Dec 05 00:45:04 crc kubenswrapper[4759]: I1205 00:45:04.539634 4759 generic.go:334] "Generic (PLEG): container finished" podID="294c868c-f73e-4ef5-b9ac-359751888700" containerID="c8d12ff814c7fe7114b6519df703ddefdb3289797ec33d4d7596113349a4d821" exitCode=0 Dec 05 00:45:04 crc kubenswrapper[4759]: I1205 00:45:04.539701 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" event={"ID":"294c868c-f73e-4ef5-b9ac-359751888700","Type":"ContainerDied","Data":"c8d12ff814c7fe7114b6519df703ddefdb3289797ec33d4d7596113349a4d821"} Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.079613 4759 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.166239 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="220b0801-31e1-4193-9b01-30a191741f12" path="/var/lib/kubelet/pods/220b0801-31e1-4193-9b01-30a191741f12/volumes" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.167242 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a5ca45a-c445-4e90-a581-f16d0c3654fd" path="/var/lib/kubelet/pods/8a5ca45a-c445-4e90-a581-f16d0c3654fd/volumes" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.215108 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/294c868c-f73e-4ef5-b9ac-359751888700-secret-volume\") pod \"294c868c-f73e-4ef5-b9ac-359751888700\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.215462 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/294c868c-f73e-4ef5-b9ac-359751888700-config-volume\") pod \"294c868c-f73e-4ef5-b9ac-359751888700\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.215686 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwqt6\" (UniqueName: \"kubernetes.io/projected/294c868c-f73e-4ef5-b9ac-359751888700-kube-api-access-gwqt6\") pod \"294c868c-f73e-4ef5-b9ac-359751888700\" (UID: \"294c868c-f73e-4ef5-b9ac-359751888700\") " Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.216766 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/294c868c-f73e-4ef5-b9ac-359751888700-config-volume" (OuterVolumeSpecName: "config-volume") pod "294c868c-f73e-4ef5-b9ac-359751888700" (UID: "294c868c-f73e-4ef5-b9ac-359751888700"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.318575 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/294c868c-f73e-4ef5-b9ac-359751888700-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.442733 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/294c868c-f73e-4ef5-b9ac-359751888700-kube-api-access-gwqt6" (OuterVolumeSpecName: "kube-api-access-gwqt6") pod "294c868c-f73e-4ef5-b9ac-359751888700" (UID: "294c868c-f73e-4ef5-b9ac-359751888700"). InnerVolumeSpecName "kube-api-access-gwqt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.443012 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294c868c-f73e-4ef5-b9ac-359751888700-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "294c868c-f73e-4ef5-b9ac-359751888700" (UID: "294c868c-f73e-4ef5-b9ac-359751888700"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.521529 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwqt6\" (UniqueName: \"kubernetes.io/projected/294c868c-f73e-4ef5-b9ac-359751888700-kube-api-access-gwqt6\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.521569 4759 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/294c868c-f73e-4ef5-b9ac-359751888700-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.584767 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" event={"ID":"294c868c-f73e-4ef5-b9ac-359751888700","Type":"ContainerDied","Data":"1805b196d9f75deda8cb6f6642b6d00b40ca358398f483dca51d5cddc02d0eee"} Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.584824 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1805b196d9f75deda8cb6f6642b6d00b40ca358398f483dca51d5cddc02d0eee" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.584887 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.594500 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="398c049422bc0a204c0d315a98c453ec61f78c5e1ec3ea26a6b6394b0111caae" exitCode=0 Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.594592 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"398c049422bc0a204c0d315a98c453ec61f78c5e1ec3ea26a6b6394b0111caae"} Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.594649 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1"} Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.594689 4759 scope.go:117] "RemoveContainer" containerID="608c9e503141cdce7db4aff57a9358edcfc9dc62d8dc9293f4521db2be6238ce" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.945880 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 05 00:45:05 crc kubenswrapper[4759]: I1205 00:45:05.946458 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 05 00:45:06 crc kubenswrapper[4759]: I1205 00:45:06.225518 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-qn796" podUID="8a5ca45a-c445-4e90-a581-f16d0c3654fd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: i/o timeout" Dec 05 00:45:06 crc kubenswrapper[4759]: I1205 00:45:06.603535 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4700495a-dd6c-4b5d-a290-55ab3907a2f5","Type":"ContainerStarted","Data":"a7109c4f65c82d137894f5bb0154f780f944846620a6e73eded15c6d36491d88"} Dec 05 00:45:06 crc kubenswrapper[4759]: I1205 00:45:06.846474 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0" Dec 05 00:45:06 crc kubenswrapper[4759]: E1205 00:45:06.846894 4759 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 00:45:06 crc kubenswrapper[4759]: E1205 00:45:06.846969 4759 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 00:45:06 crc kubenswrapper[4759]: E1205 00:45:06.847086 4759 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift podName:28edaf49-80c4-4732-a19f-1f2348fcd8e7 nodeName:}" failed. No retries permitted until 2025-12-05 00:45:22.847045311 +0000 UTC m=+1342.062706301 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift") pod "swift-storage-0" (UID: "28edaf49-80c4-4732-a19f-1f2348fcd8e7") : configmap "swift-ring-files" not found Dec 05 00:45:07 crc kubenswrapper[4759]: I1205 00:45:07.430535 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 05 00:45:07 crc kubenswrapper[4759]: I1205 00:45:07.430954 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 05 00:45:08 crc kubenswrapper[4759]: I1205 00:45:08.111884 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 05 00:45:08 crc kubenswrapper[4759]: I1205 00:45:08.118612 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 05 00:45:08 crc kubenswrapper[4759]: I1205 00:45:08.217032 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 05 00:45:08 crc kubenswrapper[4759]: I1205 00:45:08.426959 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nw9xk" podUID="d4b47f07-88f8-4a9a-97ee-7c61be8a6235" containerName="ovn-controller" probeResult="failure" output=< Dec 05 00:45:08 crc kubenswrapper[4759]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 00:45:08 crc kubenswrapper[4759]: > Dec 05 00:45:08 crc kubenswrapper[4759]: I1205 00:45:08.719635 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.603528 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-7cmgd"] Dec 05 00:45:09 crc kubenswrapper[4759]: E1205 00:45:09.604293 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a5ca45a-c445-4e90-a581-f16d0c3654fd" containerName="dnsmasq-dns" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.604366 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5ca45a-c445-4e90-a581-f16d0c3654fd" containerName="dnsmasq-dns" Dec 05 00:45:09 crc kubenswrapper[4759]: E1205 00:45:09.604383 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="220b0801-31e1-4193-9b01-30a191741f12" containerName="console" Dec 05 00:45:09 crc 
Dec 05 00:45:09 crc kubenswrapper[4759]: E1205 00:45:09.604405 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="294c868c-f73e-4ef5-b9ac-359751888700" containerName="collect-profiles" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.604412 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="294c868c-f73e-4ef5-b9ac-359751888700" containerName="collect-profiles" Dec 05 00:45:09 crc kubenswrapper[4759]: E1205 00:45:09.604433 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a5ca45a-c445-4e90-a581-f16d0c3654fd" containerName="init" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.604439 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5ca45a-c445-4e90-a581-f16d0c3654fd" containerName="init" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.604663 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="220b0801-31e1-4193-9b01-30a191741f12" containerName="console" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.604684 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="294c868c-f73e-4ef5-b9ac-359751888700" containerName="collect-profiles" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.604706 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a5ca45a-c445-4e90-a581-f16d0c3654fd" containerName="dnsmasq-dns" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.605571 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.615237 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-7cmgd"] Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.700891 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-08d7-account-create-update-hk8fj"] Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.703573 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.705375 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2308f19c-fca8-4c34-9324-c063a3f03433-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-7cmgd\" (UID: \"2308f19c-fca8-4c34-9324-c063a3f03433\") " pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.705620 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz9cd\" (UniqueName: \"kubernetes.io/projected/2308f19c-fca8-4c34-9324-c063a3f03433-kube-api-access-fz9cd\") pod \"mysqld-exporter-openstack-db-create-7cmgd\" (UID: \"2308f19c-fca8-4c34-9324-c063a3f03433\") " pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.706231 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.711012 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-08d7-account-create-update-hk8fj"] Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.807405 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz9cd\" (UniqueName: \"kubernetes.io/projected/2308f19c-fca8-4c34-9324-c063a3f03433-kube-api-access-fz9cd\") pod \"mysqld-exporter-openstack-db-create-7cmgd\" (UID: \"2308f19c-fca8-4c34-9324-c063a3f03433\") " pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.807570 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6764ecf-06a5-4456-b58f-5bf5625e56f0-operator-scripts\") pod \"mysqld-exporter-08d7-account-create-update-hk8fj\" (UID: \"e6764ecf-06a5-4456-b58f-5bf5625e56f0\") " pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.807633 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vshn\" (UniqueName: \"kubernetes.io/projected/e6764ecf-06a5-4456-b58f-5bf5625e56f0-kube-api-access-2vshn\") pod \"mysqld-exporter-08d7-account-create-update-hk8fj\" (UID: \"e6764ecf-06a5-4456-b58f-5bf5625e56f0\") " pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.807687 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2308f19c-fca8-4c34-9324-c063a3f03433-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-7cmgd\" (UID: \"2308f19c-fca8-4c34-9324-c063a3f03433\") " pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.808494 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2308f19c-fca8-4c34-9324-c063a3f03433-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-7cmgd\" (UID: \"2308f19c-fca8-4c34-9324-c063a3f03433\") " pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" Dec 05 00:45:09 crc kubenswrapper[4759]: 
I1205 00:45:09.835027 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz9cd\" (UniqueName: \"kubernetes.io/projected/2308f19c-fca8-4c34-9324-c063a3f03433-kube-api-access-fz9cd\") pod \"mysqld-exporter-openstack-db-create-7cmgd\" (UID: \"2308f19c-fca8-4c34-9324-c063a3f03433\") " pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.909703 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6764ecf-06a5-4456-b58f-5bf5625e56f0-operator-scripts\") pod \"mysqld-exporter-08d7-account-create-update-hk8fj\" (UID: \"e6764ecf-06a5-4456-b58f-5bf5625e56f0\") " pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.909781 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vshn\" (UniqueName: \"kubernetes.io/projected/e6764ecf-06a5-4456-b58f-5bf5625e56f0-kube-api-access-2vshn\") pod \"mysqld-exporter-08d7-account-create-update-hk8fj\" (UID: \"e6764ecf-06a5-4456-b58f-5bf5625e56f0\") " pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.910887 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6764ecf-06a5-4456-b58f-5bf5625e56f0-operator-scripts\") pod \"mysqld-exporter-08d7-account-create-update-hk8fj\" (UID: \"e6764ecf-06a5-4456-b58f-5bf5625e56f0\") " pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.926360 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" Dec 05 00:45:09 crc kubenswrapper[4759]: I1205 00:45:09.926757 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vshn\" (UniqueName: \"kubernetes.io/projected/e6764ecf-06a5-4456-b58f-5bf5625e56f0-kube-api-access-2vshn\") pod \"mysqld-exporter-08d7-account-create-update-hk8fj\" (UID: \"e6764ecf-06a5-4456-b58f-5bf5625e56f0\") " pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" Dec 05 00:45:10 crc kubenswrapper[4759]: I1205 00:45:10.027161 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" Dec 05 00:45:10 crc kubenswrapper[4759]: I1205 00:45:10.415577 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-7cmgd"] Dec 05 00:45:10 crc kubenswrapper[4759]: W1205 00:45:10.424099 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2308f19c_fca8_4c34_9324_c063a3f03433.slice/crio-2c67879145c567a75d0587f4ae118818f47447daad1c38293123338386a13954 WatchSource:0}: Error finding container 2c67879145c567a75d0587f4ae118818f47447daad1c38293123338386a13954: Status 404 returned error can't find the container with id 2c67879145c567a75d0587f4ae118818f47447daad1c38293123338386a13954 Dec 05 00:45:10 crc kubenswrapper[4759]: I1205 00:45:10.550997 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-08d7-account-create-update-hk8fj"] Dec 05 00:45:10 crc kubenswrapper[4759]: W1205 00:45:10.553557 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6764ecf_06a5_4456_b58f_5bf5625e56f0.slice/crio-84b60fcf9862ef781e83f782553b8a43111d06c8f7198f7248ab0e16d352f49c WatchSource:0}: Error finding container 84b60fcf9862ef781e83f782553b8a43111d06c8f7198f7248ab0e16d352f49c: Status 404 returned error can't find the container with id 84b60fcf9862ef781e83f782553b8a43111d06c8f7198f7248ab0e16d352f49c Dec 05 00:45:10 crc kubenswrapper[4759]: I1205 00:45:10.644620 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" event={"ID":"2308f19c-fca8-4c34-9324-c063a3f03433","Type":"ContainerStarted","Data":"335748906aebd85fd2c5d81cc28c8d61f6bbc466f1d3dc489245408a6376e54f"} Dec 05 00:45:10 crc kubenswrapper[4759]: I1205 00:45:10.644662 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" event={"ID":"2308f19c-fca8-4c34-9324-c063a3f03433","Type":"ContainerStarted","Data":"2c67879145c567a75d0587f4ae118818f47447daad1c38293123338386a13954"} Dec 05 00:45:10 crc kubenswrapper[4759]: I1205 00:45:10.648671 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" event={"ID":"e6764ecf-06a5-4456-b58f-5bf5625e56f0","Type":"ContainerStarted","Data":"84b60fcf9862ef781e83f782553b8a43111d06c8f7198f7248ab0e16d352f49c"} Dec 05 00:45:10 crc kubenswrapper[4759]: I1205 00:45:10.650799 4759 generic.go:334] "Generic (PLEG): container finished" podID="12e24711-58db-434b-97ed-5db25d183784" containerID="70238cad41e44f645a6d54ef9a528c262258a99b72c686a4182aa6d188c502b1" exitCode=0 Dec 05 00:45:10 crc kubenswrapper[4759]: I1205 00:45:10.650829 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wt66w" event={"ID":"12e24711-58db-434b-97ed-5db25d183784","Type":"ContainerDied","Data":"70238cad41e44f645a6d54ef9a528c262258a99b72c686a4182aa6d188c502b1"} Dec 05 00:45:10 crc kubenswrapper[4759]: I1205 00:45:10.664218 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" podStartSLOduration=1.6641989860000002 podStartE2EDuration="1.664198986s" podCreationTimestamp="2025-12-05 00:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 
00:45:10.660821963 +0000 UTC m=+1329.876482923" watchObservedRunningTime="2025-12-05 00:45:10.664198986 +0000 UTC m=+1329.879859936" Dec 05 00:45:11 crc kubenswrapper[4759]: I1205 00:45:11.665228 4759 generic.go:334] "Generic (PLEG): container finished" podID="e6764ecf-06a5-4456-b58f-5bf5625e56f0" containerID="22443be551834cc4a8f2603a6bd39976dc6754f4c846b3a7a540c8696c611040" exitCode=0 Dec 05 00:45:11 crc kubenswrapper[4759]: I1205 00:45:11.665621 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" event={"ID":"e6764ecf-06a5-4456-b58f-5bf5625e56f0","Type":"ContainerDied","Data":"22443be551834cc4a8f2603a6bd39976dc6754f4c846b3a7a540c8696c611040"} Dec 05 00:45:11 crc kubenswrapper[4759]: I1205 00:45:11.670442 4759 generic.go:334] "Generic (PLEG): container finished" podID="2308f19c-fca8-4c34-9324-c063a3f03433" containerID="335748906aebd85fd2c5d81cc28c8d61f6bbc466f1d3dc489245408a6376e54f" exitCode=0 Dec 05 00:45:11 crc kubenswrapper[4759]: I1205 00:45:11.670539 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" event={"ID":"2308f19c-fca8-4c34-9324-c063a3f03433","Type":"ContainerDied","Data":"335748906aebd85fd2c5d81cc28c8d61f6bbc466f1d3dc489245408a6376e54f"} Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.214015 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wt66w" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.250972 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-ring-data-devices\") pod \"12e24711-58db-434b-97ed-5db25d183784\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.251266 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-combined-ca-bundle\") pod \"12e24711-58db-434b-97ed-5db25d183784\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.251439 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-dispersionconf\") pod \"12e24711-58db-434b-97ed-5db25d183784\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.251662 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvzzv\" (UniqueName: \"kubernetes.io/projected/12e24711-58db-434b-97ed-5db25d183784-kube-api-access-lvzzv\") pod \"12e24711-58db-434b-97ed-5db25d183784\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.251793 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/12e24711-58db-434b-97ed-5db25d183784-etc-swift\") pod \"12e24711-58db-434b-97ed-5db25d183784\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.251942 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-scripts\") pod 
\"12e24711-58db-434b-97ed-5db25d183784\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.251982 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "12e24711-58db-434b-97ed-5db25d183784" (UID: "12e24711-58db-434b-97ed-5db25d183784"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.252434 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-swiftconf\") pod \"12e24711-58db-434b-97ed-5db25d183784\" (UID: \"12e24711-58db-434b-97ed-5db25d183784\") " Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.253201 4759 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.256339 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12e24711-58db-434b-97ed-5db25d183784-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "12e24711-58db-434b-97ed-5db25d183784" (UID: "12e24711-58db-434b-97ed-5db25d183784"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.259348 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "12e24711-58db-434b-97ed-5db25d183784" (UID: "12e24711-58db-434b-97ed-5db25d183784"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.259379 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e24711-58db-434b-97ed-5db25d183784-kube-api-access-lvzzv" (OuterVolumeSpecName: "kube-api-access-lvzzv") pod "12e24711-58db-434b-97ed-5db25d183784" (UID: "12e24711-58db-434b-97ed-5db25d183784"). InnerVolumeSpecName "kube-api-access-lvzzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.279266 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12e24711-58db-434b-97ed-5db25d183784" (UID: "12e24711-58db-434b-97ed-5db25d183784"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.281739 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-scripts" (OuterVolumeSpecName: "scripts") pod "12e24711-58db-434b-97ed-5db25d183784" (UID: "12e24711-58db-434b-97ed-5db25d183784"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.288747 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "12e24711-58db-434b-97ed-5db25d183784" (UID: "12e24711-58db-434b-97ed-5db25d183784"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.354619 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvzzv\" (UniqueName: \"kubernetes.io/projected/12e24711-58db-434b-97ed-5db25d183784-kube-api-access-lvzzv\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.354650 4759 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/12e24711-58db-434b-97ed-5db25d183784-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.354660 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12e24711-58db-434b-97ed-5db25d183784-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.354670 4759 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.354678 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.354686 4759 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/12e24711-58db-434b-97ed-5db25d183784-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.686723 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4700495a-dd6c-4b5d-a290-55ab3907a2f5","Type":"ContainerStarted","Data":"0c07116d4edac872b85967f4d4208c5c0b943a30727ba564b3e125536795746b"} Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.688645 4759 generic.go:334] "Generic (PLEG): container finished" podID="9f171e66-8683-4856-8dd3-690bbdd0f6e5" containerID="0a18484c4c14989b6cd1995e3595bcffb8b46d6d995e6356081f751cb7815390" exitCode=0 Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.688687 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f171e66-8683-4856-8dd3-690bbdd0f6e5","Type":"ContainerDied","Data":"0a18484c4c14989b6cd1995e3595bcffb8b46d6d995e6356081f751cb7815390"} Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.695400 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wt66w" event={"ID":"12e24711-58db-434b-97ed-5db25d183784","Type":"ContainerDied","Data":"9980b2010ac1c9b5d11fd662848bb8eb5167cf33c063a5ef9401678af99f519b"} Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.695464 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9980b2010ac1c9b5d11fd662848bb8eb5167cf33c063a5ef9401678af99f519b" Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.695534 4759 util.go:48] "No ready 
Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.703695 4759 generic.go:334] "Generic (PLEG): container finished" podID="a699edc7-e60f-482d-962d-6c69d625a1c5" containerID="60e223bb18ce845cab8ddf921004958f044d69a60364af0d0f26073750759ea4" exitCode=0 Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.703911 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a699edc7-e60f-482d-962d-6c69d625a1c5","Type":"ContainerDied","Data":"60e223bb18ce845cab8ddf921004958f044d69a60364af0d0f26073750759ea4"} Dec 05 00:45:12 crc kubenswrapper[4759]: I1205 00:45:12.746426 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=13.008339847 podStartE2EDuration="1m3.746407556s" podCreationTimestamp="2025-12-05 00:44:09 +0000 UTC" firstStartedPulling="2025-12-05 00:44:21.448026781 +0000 UTC m=+1280.663687731" lastFinishedPulling="2025-12-05 00:45:12.18609447 +0000 UTC m=+1331.401755440" observedRunningTime="2025-12-05 00:45:12.7286543 +0000 UTC m=+1331.944315260" watchObservedRunningTime="2025-12-05 00:45:12.746407556 +0000 UTC m=+1331.962068516" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.268737 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.274627 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.383408 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz9cd\" (UniqueName: \"kubernetes.io/projected/2308f19c-fca8-4c34-9324-c063a3f03433-kube-api-access-fz9cd\") pod \"2308f19c-fca8-4c34-9324-c063a3f03433\" (UID: \"2308f19c-fca8-4c34-9324-c063a3f03433\") " Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.383739 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6764ecf-06a5-4456-b58f-5bf5625e56f0-operator-scripts\") pod \"e6764ecf-06a5-4456-b58f-5bf5625e56f0\" (UID: \"e6764ecf-06a5-4456-b58f-5bf5625e56f0\") " Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.383769 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2308f19c-fca8-4c34-9324-c063a3f03433-operator-scripts\") pod \"2308f19c-fca8-4c34-9324-c063a3f03433\" (UID: \"2308f19c-fca8-4c34-9324-c063a3f03433\") " Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.383810 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vshn\" (UniqueName: \"kubernetes.io/projected/e6764ecf-06a5-4456-b58f-5bf5625e56f0-kube-api-access-2vshn\") pod \"e6764ecf-06a5-4456-b58f-5bf5625e56f0\" (UID: \"e6764ecf-06a5-4456-b58f-5bf5625e56f0\") " Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.384157 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6764ecf-06a5-4456-b58f-5bf5625e56f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6764ecf-06a5-4456-b58f-5bf5625e56f0" (UID: "e6764ecf-06a5-4456-b58f-5bf5625e56f0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.384598 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2308f19c-fca8-4c34-9324-c063a3f03433-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2308f19c-fca8-4c34-9324-c063a3f03433" (UID: "2308f19c-fca8-4c34-9324-c063a3f03433"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.400698 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2308f19c-fca8-4c34-9324-c063a3f03433-kube-api-access-fz9cd" (OuterVolumeSpecName: "kube-api-access-fz9cd") pod "2308f19c-fca8-4c34-9324-c063a3f03433" (UID: "2308f19c-fca8-4c34-9324-c063a3f03433"). InnerVolumeSpecName "kube-api-access-fz9cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.403455 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6764ecf-06a5-4456-b58f-5bf5625e56f0-kube-api-access-2vshn" (OuterVolumeSpecName: "kube-api-access-2vshn") pod "e6764ecf-06a5-4456-b58f-5bf5625e56f0" (UID: "e6764ecf-06a5-4456-b58f-5bf5625e56f0"). InnerVolumeSpecName "kube-api-access-2vshn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.433888 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nw9xk" podUID="d4b47f07-88f8-4a9a-97ee-7c61be8a6235" containerName="ovn-controller" probeResult="failure" output=< Dec 05 00:45:13 crc kubenswrapper[4759]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 00:45:13 crc kubenswrapper[4759]: > Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.445154 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.447929 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-nscp4" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.485351 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz9cd\" (UniqueName: \"kubernetes.io/projected/2308f19c-fca8-4c34-9324-c063a3f03433-kube-api-access-fz9cd\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.485388 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6764ecf-06a5-4456-b58f-5bf5625e56f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.485398 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2308f19c-fca8-4c34-9324-c063a3f03433-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.485407 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vshn\" (UniqueName: \"kubernetes.io/projected/e6764ecf-06a5-4456-b58f-5bf5625e56f0-kube-api-access-2vshn\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.659773 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nw9xk-config-gkldk"] Dec 05 00:45:13 crc kubenswrapper[4759]: E1205 
00:45:13.660113 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e24711-58db-434b-97ed-5db25d183784" containerName="swift-ring-rebalance" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.660130 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e24711-58db-434b-97ed-5db25d183784" containerName="swift-ring-rebalance" Dec 05 00:45:13 crc kubenswrapper[4759]: E1205 00:45:13.660141 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2308f19c-fca8-4c34-9324-c063a3f03433" containerName="mariadb-database-create" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.660148 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2308f19c-fca8-4c34-9324-c063a3f03433" containerName="mariadb-database-create" Dec 05 00:45:13 crc kubenswrapper[4759]: E1205 00:45:13.660176 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6764ecf-06a5-4456-b58f-5bf5625e56f0" containerName="mariadb-account-create-update" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.660182 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6764ecf-06a5-4456-b58f-5bf5625e56f0" containerName="mariadb-account-create-update" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.660364 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6764ecf-06a5-4456-b58f-5bf5625e56f0" containerName="mariadb-account-create-update" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.660384 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e24711-58db-434b-97ed-5db25d183784" containerName="swift-ring-rebalance" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.660403 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="2308f19c-fca8-4c34-9324-c063a3f03433" containerName="mariadb-database-create" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.660955 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.663099 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.689912 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nw9xk-config-gkldk"] Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.726943 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f171e66-8683-4856-8dd3-690bbdd0f6e5","Type":"ContainerStarted","Data":"4a743cc3904208a2b8acbf7150cd48744c98326fb8db6998d672e6baa70824e6"} Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.729614 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" event={"ID":"2308f19c-fca8-4c34-9324-c063a3f03433","Type":"ContainerDied","Data":"2c67879145c567a75d0587f4ae118818f47447daad1c38293123338386a13954"} Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.729639 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c67879145c567a75d0587f4ae118818f47447daad1c38293123338386a13954" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.729695 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-7cmgd" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.747984 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a699edc7-e60f-482d-962d-6c69d625a1c5","Type":"ContainerStarted","Data":"b511a66d1e004c0f268b8e6d9b0e5bc1a7b577592e9070b3ac0d5b290ddc8e97"} Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.748262 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.749586 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" event={"ID":"e6764ecf-06a5-4456-b58f-5bf5625e56f0","Type":"ContainerDied","Data":"84b60fcf9862ef781e83f782553b8a43111d06c8f7198f7248ab0e16d352f49c"} Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.749622 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84b60fcf9862ef781e83f782553b8a43111d06c8f7198f7248ab0e16d352f49c" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.749687 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-08d7-account-create-update-hk8fj" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.776479 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.802504788 podStartE2EDuration="1m11.776460465s" podCreationTimestamp="2025-12-05 00:44:02 +0000 UTC" firstStartedPulling="2025-12-05 00:44:22.133250867 +0000 UTC m=+1281.348911817" lastFinishedPulling="2025-12-05 00:44:37.107206544 +0000 UTC m=+1296.322867494" observedRunningTime="2025-12-05 00:45:13.761881377 +0000 UTC m=+1332.977542327" watchObservedRunningTime="2025-12-05 00:45:13.776460465 +0000 UTC m=+1332.992121415" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.791787 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.791852 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2rtg\" (UniqueName: \"kubernetes.io/projected/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-kube-api-access-l2rtg\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.791914 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run-ovn\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.791957 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-log-ovn\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " 
pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.791991 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-additional-scripts\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.792041 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-scripts\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.797959 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=54.69817052 podStartE2EDuration="1m10.797937002s" podCreationTimestamp="2025-12-05 00:44:03 +0000 UTC" firstStartedPulling="2025-12-05 00:44:21.457116266 +0000 UTC m=+1280.672777216" lastFinishedPulling="2025-12-05 00:44:37.556882748 +0000 UTC m=+1296.772543698" observedRunningTime="2025-12-05 00:45:13.790122 +0000 UTC m=+1333.005782950" watchObservedRunningTime="2025-12-05 00:45:13.797937002 +0000 UTC m=+1333.013597952" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.893808 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2rtg\" (UniqueName: \"kubernetes.io/projected/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-kube-api-access-l2rtg\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.893969 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run-ovn\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.894039 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-log-ovn\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.894101 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-additional-scripts\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.894228 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-scripts\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.894278 
4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.894486 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-log-ovn\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.894492 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run-ovn\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.894533 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.895061 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-additional-scripts\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.896041 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-scripts\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:13 crc kubenswrapper[4759]: I1205 00:45:13.917519 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2rtg\" (UniqueName: \"kubernetes.io/projected/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-kube-api-access-l2rtg\") pod \"ovn-controller-nw9xk-config-gkldk\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:14 crc kubenswrapper[4759]: I1205 00:45:14.011494 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:14 crc kubenswrapper[4759]: I1205 00:45:14.371765 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nw9xk-config-gkldk"] Dec 05 00:45:14 crc kubenswrapper[4759]: I1205 00:45:14.541278 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 00:45:14 crc kubenswrapper[4759]: I1205 00:45:14.757918 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw9xk-config-gkldk" event={"ID":"d4d61f2f-170a-42e4-92a6-bd6afdce3a48","Type":"ContainerStarted","Data":"aa8da572cf07e326f227a0ee1325d03405affadfadde5b8c194e6003455db497"} Dec 05 00:45:14 crc kubenswrapper[4759]: I1205 00:45:14.757991 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw9xk-config-gkldk" event={"ID":"d4d61f2f-170a-42e4-92a6-bd6afdce3a48","Type":"ContainerStarted","Data":"fe5b0761c365a167fc8727b62d487378904b712545521e0071d57708a09ce94f"} Dec 05 00:45:14 crc kubenswrapper[4759]: I1205 00:45:14.902501 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nw9xk-config-gkldk" podStartSLOduration=1.9024829699999999 podStartE2EDuration="1.90248297s" podCreationTimestamp="2025-12-05 00:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:14.800034365 +0000 UTC m=+1334.015695315" watchObservedRunningTime="2025-12-05 00:45:14.90248297 +0000 UTC m=+1334.118143920" Dec 05 00:45:14 crc kubenswrapper[4759]: I1205 00:45:14.914435 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv"] Dec 05 00:45:14 crc kubenswrapper[4759]: I1205 00:45:14.915953 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" Dec 05 00:45:14 crc kubenswrapper[4759]: I1205 00:45:14.922354 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv"] Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.011634 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4fwv\" (UniqueName: \"kubernetes.io/projected/ca571ecf-ba87-47b0-acb6-a76d034f0b32-kube-api-access-w4fwv\") pod \"mysqld-exporter-openstack-cell1-db-create-xp7tv\" (UID: \"ca571ecf-ba87-47b0-acb6-a76d034f0b32\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.011674 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca571ecf-ba87-47b0-acb6-a76d034f0b32-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-xp7tv\" (UID: \"ca571ecf-ba87-47b0-acb6-a76d034f0b32\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.113748 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4fwv\" (UniqueName: \"kubernetes.io/projected/ca571ecf-ba87-47b0-acb6-a76d034f0b32-kube-api-access-w4fwv\") pod \"mysqld-exporter-openstack-cell1-db-create-xp7tv\" (UID: \"ca571ecf-ba87-47b0-acb6-a76d034f0b32\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.114018 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca571ecf-ba87-47b0-acb6-a76d034f0b32-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-xp7tv\" (UID: \"ca571ecf-ba87-47b0-acb6-a76d034f0b32\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.114642 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca571ecf-ba87-47b0-acb6-a76d034f0b32-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-xp7tv\" (UID: \"ca571ecf-ba87-47b0-acb6-a76d034f0b32\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.145752 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4fwv\" (UniqueName: \"kubernetes.io/projected/ca571ecf-ba87-47b0-acb6-a76d034f0b32-kube-api-access-w4fwv\") pod \"mysqld-exporter-openstack-cell1-db-create-xp7tv\" (UID: \"ca571ecf-ba87-47b0-acb6-a76d034f0b32\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.166583 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-8fe5-account-create-update-bn9md"] Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.167580 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.170949 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-8fe5-account-create-update-bn9md"] Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.194435 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.249996 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.317479 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-operator-scripts\") pod \"mysqld-exporter-8fe5-account-create-update-bn9md\" (UID: \"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7\") " pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.317631 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk9vc\" (UniqueName: \"kubernetes.io/projected/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-kube-api-access-pk9vc\") pod \"mysqld-exporter-8fe5-account-create-update-bn9md\" (UID: \"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7\") " pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.419366 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-operator-scripts\") pod \"mysqld-exporter-8fe5-account-create-update-bn9md\" (UID: \"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7\") " pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.419631 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk9vc\" (UniqueName: \"kubernetes.io/projected/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-kube-api-access-pk9vc\") pod \"mysqld-exporter-8fe5-account-create-update-bn9md\" (UID: \"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7\") " pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.421184 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-operator-scripts\") pod \"mysqld-exporter-8fe5-account-create-update-bn9md\" (UID: \"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7\") " pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.444695 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk9vc\" (UniqueName: \"kubernetes.io/projected/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-kube-api-access-pk9vc\") pod \"mysqld-exporter-8fe5-account-create-update-bn9md\" (UID: \"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7\") " pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.493531 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.767166 4759 generic.go:334] "Generic (PLEG): container finished" podID="d4d61f2f-170a-42e4-92a6-bd6afdce3a48" containerID="aa8da572cf07e326f227a0ee1325d03405affadfadde5b8c194e6003455db497" exitCode=0 Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.767199 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw9xk-config-gkldk" event={"ID":"d4d61f2f-170a-42e4-92a6-bd6afdce3a48","Type":"ContainerDied","Data":"aa8da572cf07e326f227a0ee1325d03405affadfadde5b8c194e6003455db497"} Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.782923 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv"] Dec 05 00:45:15 crc kubenswrapper[4759]: I1205 00:45:15.964828 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-8fe5-account-create-update-bn9md"] Dec 05 00:45:15 crc kubenswrapper[4759]: W1205 00:45:15.970151 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf7d1943_eee7_4d3c_bd3a_3d8e41f0fee7.slice/crio-0f568fdfc594a1445dfec4becd3f82b582790ce452702bf82fb40d0effd3bd81 WatchSource:0}: Error finding container 0f568fdfc594a1445dfec4becd3f82b582790ce452702bf82fb40d0effd3bd81: Status 404 returned error can't find the container with id 0f568fdfc594a1445dfec4becd3f82b582790ce452702bf82fb40d0effd3bd81 Dec 05 00:45:16 crc kubenswrapper[4759]: I1205 00:45:16.074288 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:16 crc kubenswrapper[4759]: I1205 00:45:16.780582 4759 generic.go:334] "Generic (PLEG): container finished" podID="cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7" containerID="a8259ea1d2606a0abbc15370e1783aa517587a3b2b1beba8885113c527dcc025" exitCode=0 Dec 05 00:45:16 crc kubenswrapper[4759]: I1205 00:45:16.780774 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" event={"ID":"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7","Type":"ContainerDied","Data":"a8259ea1d2606a0abbc15370e1783aa517587a3b2b1beba8885113c527dcc025"} Dec 05 00:45:16 crc kubenswrapper[4759]: I1205 00:45:16.781010 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" event={"ID":"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7","Type":"ContainerStarted","Data":"0f568fdfc594a1445dfec4becd3f82b582790ce452702bf82fb40d0effd3bd81"} Dec 05 00:45:16 crc kubenswrapper[4759]: I1205 00:45:16.782911 4759 generic.go:334] "Generic (PLEG): container finished" podID="ca571ecf-ba87-47b0-acb6-a76d034f0b32" containerID="308e6c3fa50fd86f198fcbac21519d3c0e4239dbc90e6ea87b2e95d3ac1923e3" exitCode=0 Dec 05 00:45:16 crc kubenswrapper[4759]: I1205 00:45:16.782969 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" event={"ID":"ca571ecf-ba87-47b0-acb6-a76d034f0b32","Type":"ContainerDied","Data":"308e6c3fa50fd86f198fcbac21519d3c0e4239dbc90e6ea87b2e95d3ac1923e3"} Dec 05 00:45:16 crc kubenswrapper[4759]: I1205 00:45:16.782998 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" 
event={"ID":"ca571ecf-ba87-47b0-acb6-a76d034f0b32","Type":"ContainerStarted","Data":"380f5e31f0fbf81542c476b9d6950327244571ab57080601515f2c55e2050f75"} Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.173665 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.251255 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-additional-scripts\") pod \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.251352 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-scripts\") pod \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.251379 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2rtg\" (UniqueName: \"kubernetes.io/projected/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-kube-api-access-l2rtg\") pod \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.251413 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run\") pod \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.251439 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run-ovn\") pod \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.251551 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-log-ovn\") pod \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\" (UID: \"d4d61f2f-170a-42e4-92a6-bd6afdce3a48\") " Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.251958 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d4d61f2f-170a-42e4-92a6-bd6afdce3a48" (UID: "d4d61f2f-170a-42e4-92a6-bd6afdce3a48"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.252416 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run" (OuterVolumeSpecName: "var-run") pod "d4d61f2f-170a-42e4-92a6-bd6afdce3a48" (UID: "d4d61f2f-170a-42e4-92a6-bd6afdce3a48"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.253009 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d4d61f2f-170a-42e4-92a6-bd6afdce3a48" (UID: "d4d61f2f-170a-42e4-92a6-bd6afdce3a48"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.253086 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-scripts" (OuterVolumeSpecName: "scripts") pod "d4d61f2f-170a-42e4-92a6-bd6afdce3a48" (UID: "d4d61f2f-170a-42e4-92a6-bd6afdce3a48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.253620 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d4d61f2f-170a-42e4-92a6-bd6afdce3a48" (UID: "d4d61f2f-170a-42e4-92a6-bd6afdce3a48"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.260520 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-kube-api-access-l2rtg" (OuterVolumeSpecName: "kube-api-access-l2rtg") pod "d4d61f2f-170a-42e4-92a6-bd6afdce3a48" (UID: "d4d61f2f-170a-42e4-92a6-bd6afdce3a48"). InnerVolumeSpecName "kube-api-access-l2rtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.353136 4759 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.353168 4759 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.353183 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.353193 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2rtg\" (UniqueName: \"kubernetes.io/projected/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-kube-api-access-l2rtg\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.353204 4759 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.353214 4759 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4d61f2f-170a-42e4-92a6-bd6afdce3a48-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.558009 4759 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-db-create-xshzj"] Dec 05 00:45:17 crc kubenswrapper[4759]: E1205 00:45:17.558439 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d61f2f-170a-42e4-92a6-bd6afdce3a48" containerName="ovn-config" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.558455 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d61f2f-170a-42e4-92a6-bd6afdce3a48" containerName="ovn-config" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.558628 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d61f2f-170a-42e4-92a6-bd6afdce3a48" containerName="ovn-config" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.559241 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xshzj" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.574108 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xshzj"] Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.658880 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-operator-scripts\") pod \"keystone-db-create-xshzj\" (UID: \"ce7ddb84-b42d-4a8b-ae8c-a263a618b408\") " pod="openstack/keystone-db-create-xshzj" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.659215 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c9ts\" (UniqueName: \"kubernetes.io/projected/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-kube-api-access-8c9ts\") pod \"keystone-db-create-xshzj\" (UID: \"ce7ddb84-b42d-4a8b-ae8c-a263a618b408\") " pod="openstack/keystone-db-create-xshzj" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.678870 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-936e-account-create-update-69tpk"] Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.680046 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-936e-account-create-update-69tpk" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.682093 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.691534 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-936e-account-create-update-69tpk"] Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.760553 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-operator-scripts\") pod \"keystone-db-create-xshzj\" (UID: \"ce7ddb84-b42d-4a8b-ae8c-a263a618b408\") " pod="openstack/keystone-db-create-xshzj" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.760616 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c9ts\" (UniqueName: \"kubernetes.io/projected/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-kube-api-access-8c9ts\") pod \"keystone-db-create-xshzj\" (UID: \"ce7ddb84-b42d-4a8b-ae8c-a263a618b408\") " pod="openstack/keystone-db-create-xshzj" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.760711 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52600026-ecf0-4ee9-a517-e2e1005d3b5d-operator-scripts\") pod \"keystone-936e-account-create-update-69tpk\" (UID: \"52600026-ecf0-4ee9-a517-e2e1005d3b5d\") " pod="openstack/keystone-936e-account-create-update-69tpk" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.760751 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqc8n\" (UniqueName: \"kubernetes.io/projected/52600026-ecf0-4ee9-a517-e2e1005d3b5d-kube-api-access-cqc8n\") pod \"keystone-936e-account-create-update-69tpk\" (UID: \"52600026-ecf0-4ee9-a517-e2e1005d3b5d\") " pod="openstack/keystone-936e-account-create-update-69tpk" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.761320 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-operator-scripts\") pod \"keystone-db-create-xshzj\" (UID: \"ce7ddb84-b42d-4a8b-ae8c-a263a618b408\") " pod="openstack/keystone-db-create-xshzj" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.777649 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c9ts\" (UniqueName: \"kubernetes.io/projected/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-kube-api-access-8c9ts\") pod \"keystone-db-create-xshzj\" (UID: \"ce7ddb84-b42d-4a8b-ae8c-a263a618b408\") " pod="openstack/keystone-db-create-xshzj" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.814439 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nw9xk-config-gkldk" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.816497 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw9xk-config-gkldk" event={"ID":"d4d61f2f-170a-42e4-92a6-bd6afdce3a48","Type":"ContainerDied","Data":"fe5b0761c365a167fc8727b62d487378904b712545521e0071d57708a09ce94f"} Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.816538 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe5b0761c365a167fc8727b62d487378904b712545521e0071d57708a09ce94f" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.862794 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52600026-ecf0-4ee9-a517-e2e1005d3b5d-operator-scripts\") pod \"keystone-936e-account-create-update-69tpk\" (UID: \"52600026-ecf0-4ee9-a517-e2e1005d3b5d\") " pod="openstack/keystone-936e-account-create-update-69tpk" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.862888 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqc8n\" (UniqueName: \"kubernetes.io/projected/52600026-ecf0-4ee9-a517-e2e1005d3b5d-kube-api-access-cqc8n\") pod \"keystone-936e-account-create-update-69tpk\" (UID: \"52600026-ecf0-4ee9-a517-e2e1005d3b5d\") " pod="openstack/keystone-936e-account-create-update-69tpk" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.863788 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52600026-ecf0-4ee9-a517-e2e1005d3b5d-operator-scripts\") pod \"keystone-936e-account-create-update-69tpk\" (UID: \"52600026-ecf0-4ee9-a517-e2e1005d3b5d\") " pod="openstack/keystone-936e-account-create-update-69tpk" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.876286 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xshzj" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.876491 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ftb6f"] Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.877667 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ftb6f" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.885400 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqc8n\" (UniqueName: \"kubernetes.io/projected/52600026-ecf0-4ee9-a517-e2e1005d3b5d-kube-api-access-cqc8n\") pod \"keystone-936e-account-create-update-69tpk\" (UID: \"52600026-ecf0-4ee9-a517-e2e1005d3b5d\") " pod="openstack/keystone-936e-account-create-update-69tpk" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.897101 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ftb6f"] Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.964048 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55592703-6d06-487c-a4e4-823be631891e-operator-scripts\") pod \"placement-db-create-ftb6f\" (UID: \"55592703-6d06-487c-a4e4-823be631891e\") " pod="openstack/placement-db-create-ftb6f" Dec 05 00:45:17 crc kubenswrapper[4759]: I1205 00:45:17.964114 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k68wj\" (UniqueName: \"kubernetes.io/projected/55592703-6d06-487c-a4e4-823be631891e-kube-api-access-k68wj\") pod \"placement-db-create-ftb6f\" (UID: \"55592703-6d06-487c-a4e4-823be631891e\") " pod="openstack/placement-db-create-ftb6f" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.005714 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-936e-account-create-update-69tpk" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.046873 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7dea-account-create-update-mjncr"] Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.048083 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7dea-account-create-update-mjncr" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.051574 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.070396 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55592703-6d06-487c-a4e4-823be631891e-operator-scripts\") pod \"placement-db-create-ftb6f\" (UID: \"55592703-6d06-487c-a4e4-823be631891e\") " pod="openstack/placement-db-create-ftb6f" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.070504 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k68wj\" (UniqueName: \"kubernetes.io/projected/55592703-6d06-487c-a4e4-823be631891e-kube-api-access-k68wj\") pod \"placement-db-create-ftb6f\" (UID: \"55592703-6d06-487c-a4e4-823be631891e\") " pod="openstack/placement-db-create-ftb6f" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.071618 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55592703-6d06-487c-a4e4-823be631891e-operator-scripts\") pod \"placement-db-create-ftb6f\" (UID: \"55592703-6d06-487c-a4e4-823be631891e\") " pod="openstack/placement-db-create-ftb6f" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.071931 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7dea-account-create-update-mjncr"] Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.122942 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k68wj\" (UniqueName: \"kubernetes.io/projected/55592703-6d06-487c-a4e4-823be631891e-kube-api-access-k68wj\") pod \"placement-db-create-ftb6f\" (UID: \"55592703-6d06-487c-a4e4-823be631891e\") " pod="openstack/placement-db-create-ftb6f" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.174285 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnq6h\" (UniqueName: \"kubernetes.io/projected/db21999b-a97e-4d2f-a9d1-c2f2b7049998-kube-api-access-mnq6h\") pod \"placement-7dea-account-create-update-mjncr\" (UID: \"db21999b-a97e-4d2f-a9d1-c2f2b7049998\") " pod="openstack/placement-7dea-account-create-update-mjncr" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.174446 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db21999b-a97e-4d2f-a9d1-c2f2b7049998-operator-scripts\") pod \"placement-7dea-account-create-update-mjncr\" (UID: \"db21999b-a97e-4d2f-a9d1-c2f2b7049998\") " pod="openstack/placement-7dea-account-create-update-mjncr" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.213227 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8wnf5"] Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.215013 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8wnf5" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.221279 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8wnf5"] Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.276134 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ftb6f" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.276642 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db21999b-a97e-4d2f-a9d1-c2f2b7049998-operator-scripts\") pod \"placement-7dea-account-create-update-mjncr\" (UID: \"db21999b-a97e-4d2f-a9d1-c2f2b7049998\") " pod="openstack/placement-7dea-account-create-update-mjncr" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.276753 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4sw6\" (UniqueName: \"kubernetes.io/projected/ec67af89-9918-4e80-95e5-a90ed96c7c04-kube-api-access-q4sw6\") pod \"glance-db-create-8wnf5\" (UID: \"ec67af89-9918-4e80-95e5-a90ed96c7c04\") " pod="openstack/glance-db-create-8wnf5" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.276818 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnq6h\" (UniqueName: \"kubernetes.io/projected/db21999b-a97e-4d2f-a9d1-c2f2b7049998-kube-api-access-mnq6h\") pod \"placement-7dea-account-create-update-mjncr\" (UID: \"db21999b-a97e-4d2f-a9d1-c2f2b7049998\") " pod="openstack/placement-7dea-account-create-update-mjncr" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.276869 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec67af89-9918-4e80-95e5-a90ed96c7c04-operator-scripts\") pod \"glance-db-create-8wnf5\" (UID: \"ec67af89-9918-4e80-95e5-a90ed96c7c04\") " pod="openstack/glance-db-create-8wnf5" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.278331 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db21999b-a97e-4d2f-a9d1-c2f2b7049998-operator-scripts\") pod \"placement-7dea-account-create-update-mjncr\" (UID: \"db21999b-a97e-4d2f-a9d1-c2f2b7049998\") " pod="openstack/placement-7dea-account-create-update-mjncr" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.302828 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnq6h\" (UniqueName: \"kubernetes.io/projected/db21999b-a97e-4d2f-a9d1-c2f2b7049998-kube-api-access-mnq6h\") pod \"placement-7dea-account-create-update-mjncr\" (UID: \"db21999b-a97e-4d2f-a9d1-c2f2b7049998\") " pod="openstack/placement-7dea-account-create-update-mjncr" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.332394 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-456f-account-create-update-2fx6n"] Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.333709 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-456f-account-create-update-2fx6n" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.349137 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.406643 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec67af89-9918-4e80-95e5-a90ed96c7c04-operator-scripts\") pod \"glance-db-create-8wnf5\" (UID: \"ec67af89-9918-4e80-95e5-a90ed96c7c04\") " pod="openstack/glance-db-create-8wnf5" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.406852 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4sw6\" (UniqueName: \"kubernetes.io/projected/ec67af89-9918-4e80-95e5-a90ed96c7c04-kube-api-access-q4sw6\") pod \"glance-db-create-8wnf5\" (UID: \"ec67af89-9918-4e80-95e5-a90ed96c7c04\") " pod="openstack/glance-db-create-8wnf5" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.407843 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec67af89-9918-4e80-95e5-a90ed96c7c04-operator-scripts\") pod \"glance-db-create-8wnf5\" (UID: \"ec67af89-9918-4e80-95e5-a90ed96c7c04\") " pod="openstack/glance-db-create-8wnf5" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.408547 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7dea-account-create-update-mjncr" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.491656 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4sw6\" (UniqueName: \"kubernetes.io/projected/ec67af89-9918-4e80-95e5-a90ed96c7c04-kube-api-access-q4sw6\") pod \"glance-db-create-8wnf5\" (UID: \"ec67af89-9918-4e80-95e5-a90ed96c7c04\") " pod="openstack/glance-db-create-8wnf5" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.507163 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-456f-account-create-update-2fx6n"] Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.508075 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-operator-scripts\") pod \"glance-456f-account-create-update-2fx6n\" (UID: \"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8\") " pod="openstack/glance-456f-account-create-update-2fx6n" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.508148 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqqrl\" (UniqueName: \"kubernetes.io/projected/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-kube-api-access-dqqrl\") pod \"glance-456f-account-create-update-2fx6n\" (UID: \"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8\") " pod="openstack/glance-456f-account-create-update-2fx6n" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.543133 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8wnf5" Dec 05 00:45:18 crc kubenswrapper[4759]: I1205 00:45:18.548229 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nw9xk" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.617603 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-operator-scripts\") pod \"glance-456f-account-create-update-2fx6n\" (UID: \"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8\") " pod="openstack/glance-456f-account-create-update-2fx6n" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.617711 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqqrl\" (UniqueName: \"kubernetes.io/projected/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-kube-api-access-dqqrl\") pod \"glance-456f-account-create-update-2fx6n\" (UID: \"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8\") " pod="openstack/glance-456f-account-create-update-2fx6n" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.619206 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-operator-scripts\") pod \"glance-456f-account-create-update-2fx6n\" (UID: \"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8\") " pod="openstack/glance-456f-account-create-update-2fx6n" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.655839 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqqrl\" (UniqueName: \"kubernetes.io/projected/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-kube-api-access-dqqrl\") pod \"glance-456f-account-create-update-2fx6n\" (UID: \"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8\") " pod="openstack/glance-456f-account-create-update-2fx6n" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.668378 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nw9xk-config-gkldk"] Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.678110 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nw9xk-config-gkldk"] Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.788892 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nw9xk-config-s45g9"] Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.790013 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.794544 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.804200 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nw9xk-config-s45g9"] Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.816734 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-456f-account-create-update-2fx6n" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.824928 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xshzj"] Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.889214 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.893470 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.922601 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-additional-scripts\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.922665 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run-ovn\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.922695 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-scripts\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.922735 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.922779 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-log-ovn\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:18.922828 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vtfq\" (UniqueName: \"kubernetes.io/projected/5f563704-00ec-45f7-a74e-58fc124f4e59-kube-api-access-4vtfq\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.025905 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk9vc\" (UniqueName: \"kubernetes.io/projected/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-kube-api-access-pk9vc\") pod \"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7\" (UID: \"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7\") " Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.026346 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca571ecf-ba87-47b0-acb6-a76d034f0b32-operator-scripts\") pod \"ca571ecf-ba87-47b0-acb6-a76d034f0b32\" (UID: \"ca571ecf-ba87-47b0-acb6-a76d034f0b32\") " Dec 05 00:45:20 crc 
kubenswrapper[4759]: I1205 00:45:19.026496 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4fwv\" (UniqueName: \"kubernetes.io/projected/ca571ecf-ba87-47b0-acb6-a76d034f0b32-kube-api-access-w4fwv\") pod \"ca571ecf-ba87-47b0-acb6-a76d034f0b32\" (UID: \"ca571ecf-ba87-47b0-acb6-a76d034f0b32\") " Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.026926 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-operator-scripts\") pod \"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7\" (UID: \"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7\") " Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.027374 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vtfq\" (UniqueName: \"kubernetes.io/projected/5f563704-00ec-45f7-a74e-58fc124f4e59-kube-api-access-4vtfq\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.027502 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-additional-scripts\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.027517 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7" (UID: "cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.027538 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run-ovn\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.027625 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-scripts\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.027707 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.027765 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run-ovn\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.027818 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-log-ovn\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.027965 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.028027 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-log-ovn\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.028070 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.029991 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-additional-scripts\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.030350 4759 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/ca571ecf-ba87-47b0-acb6-a76d034f0b32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca571ecf-ba87-47b0-acb6-a76d034f0b32" (UID: "ca571ecf-ba87-47b0-acb6-a76d034f0b32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.030431 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-scripts\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.034172 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-kube-api-access-pk9vc" (OuterVolumeSpecName: "kube-api-access-pk9vc") pod "cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7" (UID: "cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7"). InnerVolumeSpecName "kube-api-access-pk9vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.034210 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca571ecf-ba87-47b0-acb6-a76d034f0b32-kube-api-access-w4fwv" (OuterVolumeSpecName: "kube-api-access-w4fwv") pod "ca571ecf-ba87-47b0-acb6-a76d034f0b32" (UID: "ca571ecf-ba87-47b0-acb6-a76d034f0b32"). InnerVolumeSpecName "kube-api-access-w4fwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.045284 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vtfq\" (UniqueName: \"kubernetes.io/projected/5f563704-00ec-45f7-a74e-58fc124f4e59-kube-api-access-4vtfq\") pod \"ovn-controller-nw9xk-config-s45g9\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") " pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.109782 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nw9xk-config-s45g9" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.129388 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca571ecf-ba87-47b0-acb6-a76d034f0b32-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.129420 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4fwv\" (UniqueName: \"kubernetes.io/projected/ca571ecf-ba87-47b0-acb6-a76d034f0b32-kube-api-access-w4fwv\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.129430 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk9vc\" (UniqueName: \"kubernetes.io/projected/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7-kube-api-access-pk9vc\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.166555 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d61f2f-170a-42e4-92a6-bd6afdce3a48" path="/var/lib/kubelet/pods/d4d61f2f-170a-42e4-92a6-bd6afdce3a48/volumes" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.832273 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.832267 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv" event={"ID":"ca571ecf-ba87-47b0-acb6-a76d034f0b32","Type":"ContainerDied","Data":"380f5e31f0fbf81542c476b9d6950327244571ab57080601515f2c55e2050f75"} Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.832751 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="380f5e31f0fbf81542c476b9d6950327244571ab57080601515f2c55e2050f75" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.833930 4759 generic.go:334] "Generic (PLEG): container finished" podID="ce7ddb84-b42d-4a8b-ae8c-a263a618b408" containerID="cacc93f2f921c3f2f226742855ac925546576dd67c848ae12403ed2aa2cc1f58" exitCode=0 Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.834043 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xshzj" event={"ID":"ce7ddb84-b42d-4a8b-ae8c-a263a618b408","Type":"ContainerDied","Data":"cacc93f2f921c3f2f226742855ac925546576dd67c848ae12403ed2aa2cc1f58"} Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.834091 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xshzj" event={"ID":"ce7ddb84-b42d-4a8b-ae8c-a263a618b408","Type":"ContainerStarted","Data":"72bd02a24a63331ec923c0f57ae91d96c26aefa3a6a18d863adde0206e1cff74"} Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.836072 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md" event={"ID":"cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7","Type":"ContainerDied","Data":"0f568fdfc594a1445dfec4becd3f82b582790ce452702bf82fb40d0effd3bd81"} Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.836094 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f568fdfc594a1445dfec4becd3f82b582790ce452702bf82fb40d0effd3bd81" Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:19.836120 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-8fe5-account-create-update-bn9md"
Dec 05 00:45:20 crc kubenswrapper[4759]: W1205 00:45:20.299083 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb21999b_a97e_4d2f_a9d1_c2f2b7049998.slice/crio-f960cd54518b4b90be40cdaf0bed6c814d47f6bf0270052c9a40650180a01f5a WatchSource:0}: Error finding container f960cd54518b4b90be40cdaf0bed6c814d47f6bf0270052c9a40650180a01f5a: Status 404 returned error can't find the container with id f960cd54518b4b90be40cdaf0bed6c814d47f6bf0270052c9a40650180a01f5a
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.299912 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7dea-account-create-update-mjncr"]
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.310111 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-936e-account-create-update-69tpk"]
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.511627 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-456f-account-create-update-2fx6n"]
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.526345 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ftb6f"]
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.539656 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8wnf5"]
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.564818 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nw9xk-config-s45g9"]
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.847115 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-456f-account-create-update-2fx6n" event={"ID":"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8","Type":"ContainerStarted","Data":"78d4f07a56d424abbfcd627597f9237a64884cdcb70b081ea01e3bb50949420b"}
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.847410 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-456f-account-create-update-2fx6n" event={"ID":"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8","Type":"ContainerStarted","Data":"5a3b52a35229158d220f71ebdcace447aa9f166441c956c88a615f07a48ed00d"}
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.850458 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-936e-account-create-update-69tpk" event={"ID":"52600026-ecf0-4ee9-a517-e2e1005d3b5d","Type":"ContainerStarted","Data":"60c2aa04d7359cf944b73d19e7ce6bb16e766390f05563dde6e7fc590731357f"}
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.850491 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-936e-account-create-update-69tpk" event={"ID":"52600026-ecf0-4ee9-a517-e2e1005d3b5d","Type":"ContainerStarted","Data":"e1432730fc1ddb30e88ac803f241d696b1b2646451f782e0ff849919c5c03960"}
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.876409 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw9xk-config-s45g9" event={"ID":"5f563704-00ec-45f7-a74e-58fc124f4e59","Type":"ContainerStarted","Data":"5161e780ab223388160c778bc557d1402143a8890e91ad70998caf50d7c804be"}
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.884787 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7dea-account-create-update-mjncr" event={"ID":"db21999b-a97e-4d2f-a9d1-c2f2b7049998","Type":"ContainerStarted","Data":"e1cbf0a8e4d70332f965890efdebd0853d650b56061d90f43e8f2ceeb1f27cc7"}
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.884830 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7dea-account-create-update-mjncr" event={"ID":"db21999b-a97e-4d2f-a9d1-c2f2b7049998","Type":"ContainerStarted","Data":"f960cd54518b4b90be40cdaf0bed6c814d47f6bf0270052c9a40650180a01f5a"}
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.886557 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-456f-account-create-update-2fx6n" podStartSLOduration=2.886540415 podStartE2EDuration="2.886540415s" podCreationTimestamp="2025-12-05 00:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:20.875696209 +0000 UTC m=+1340.091357159" watchObservedRunningTime="2025-12-05 00:45:20.886540415 +0000 UTC m=+1340.102201365"
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.897022 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8wnf5" event={"ID":"ec67af89-9918-4e80-95e5-a90ed96c7c04","Type":"ContainerStarted","Data":"1b74aed02e69ae52123cba02929413afb2d445abacd4f197515eceae2bbf81de"}
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.897065 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8wnf5" event={"ID":"ec67af89-9918-4e80-95e5-a90ed96c7c04","Type":"ContainerStarted","Data":"549eb9abe5800b470975f64d0d7ab093fac83ad85684fdc4c5919714f6134883"}
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.902858 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ftb6f" event={"ID":"55592703-6d06-487c-a4e4-823be631891e","Type":"ContainerStarted","Data":"f7047d6949ce3f8132b1d2ce7043864eb36118addfe08d0a3e6b08b2ad94a645"}
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.902884 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ftb6f" event={"ID":"55592703-6d06-487c-a4e4-823be631891e","Type":"ContainerStarted","Data":"a6b377e314c5cd91bf2e6a8bef4bfde2d6192db6019a3bce1ad23fae81072f1f"}
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.916128 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-936e-account-create-update-69tpk" podStartSLOduration=3.916104841 podStartE2EDuration="3.916104841s" podCreationTimestamp="2025-12-05 00:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:20.893508637 +0000 UTC m=+1340.109169587" watchObservedRunningTime="2025-12-05 00:45:20.916104841 +0000 UTC m=+1340.131765791"
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.949979 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7dea-account-create-update-mjncr" podStartSLOduration=2.949955842 podStartE2EDuration="2.949955842s" podCreationTimestamp="2025-12-05 00:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:20.907337576 +0000 UTC m=+1340.122998526" watchObservedRunningTime="2025-12-05 00:45:20.949955842 +0000 UTC m=+1340.165616792"
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.957240 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-ftb6f" podStartSLOduration=3.95722216 podStartE2EDuration="3.95722216s" podCreationTimestamp="2025-12-05 00:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:20.925633905 +0000 UTC m=+1340.141294855" watchObservedRunningTime="2025-12-05 00:45:20.95722216 +0000 UTC m=+1340.172883110"
Dec 05 00:45:20 crc kubenswrapper[4759]: I1205 00:45:20.969340 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-8wnf5" podStartSLOduration=2.969325858 podStartE2EDuration="2.969325858s" podCreationTimestamp="2025-12-05 00:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:20.938374518 +0000 UTC m=+1340.154035468" watchObservedRunningTime="2025-12-05 00:45:20.969325858 +0000 UTC m=+1340.184986808"
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.217914 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xshzj"
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.369300 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-operator-scripts\") pod \"ce7ddb84-b42d-4a8b-ae8c-a263a618b408\" (UID: \"ce7ddb84-b42d-4a8b-ae8c-a263a618b408\") "
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.369442 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c9ts\" (UniqueName: \"kubernetes.io/projected/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-kube-api-access-8c9ts\") pod \"ce7ddb84-b42d-4a8b-ae8c-a263a618b408\" (UID: \"ce7ddb84-b42d-4a8b-ae8c-a263a618b408\") "
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.369820 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce7ddb84-b42d-4a8b-ae8c-a263a618b408" (UID: "ce7ddb84-b42d-4a8b-ae8c-a263a618b408"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.377520 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-kube-api-access-8c9ts" (OuterVolumeSpecName: "kube-api-access-8c9ts") pod "ce7ddb84-b42d-4a8b-ae8c-a263a618b408" (UID: "ce7ddb84-b42d-4a8b-ae8c-a263a618b408"). InnerVolumeSpecName "kube-api-access-8c9ts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.471441 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.471473 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c9ts\" (UniqueName: \"kubernetes.io/projected/ce7ddb84-b42d-4a8b-ae8c-a263a618b408-kube-api-access-8c9ts\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.916472 4759 generic.go:334] "Generic (PLEG): container finished" podID="52600026-ecf0-4ee9-a517-e2e1005d3b5d" containerID="60c2aa04d7359cf944b73d19e7ce6bb16e766390f05563dde6e7fc590731357f" exitCode=0
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.916614 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-936e-account-create-update-69tpk" event={"ID":"52600026-ecf0-4ee9-a517-e2e1005d3b5d","Type":"ContainerDied","Data":"60c2aa04d7359cf944b73d19e7ce6bb16e766390f05563dde6e7fc590731357f"}
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.918572 4759 generic.go:334] "Generic (PLEG): container finished" podID="5f563704-00ec-45f7-a74e-58fc124f4e59" containerID="49888a093825543ae1d84c4d182ad7b4b8480f8bc9ad71fa039380f503c1de8b" exitCode=0
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.918858 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw9xk-config-s45g9" event={"ID":"5f563704-00ec-45f7-a74e-58fc124f4e59","Type":"ContainerDied","Data":"49888a093825543ae1d84c4d182ad7b4b8480f8bc9ad71fa039380f503c1de8b"}
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.920354 4759 generic.go:334] "Generic (PLEG): container finished" podID="db21999b-a97e-4d2f-a9d1-c2f2b7049998" containerID="e1cbf0a8e4d70332f965890efdebd0853d650b56061d90f43e8f2ceeb1f27cc7" exitCode=0
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.920427 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7dea-account-create-update-mjncr" event={"ID":"db21999b-a97e-4d2f-a9d1-c2f2b7049998","Type":"ContainerDied","Data":"e1cbf0a8e4d70332f965890efdebd0853d650b56061d90f43e8f2ceeb1f27cc7"}
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.921969 4759 generic.go:334] "Generic (PLEG): container finished" podID="55592703-6d06-487c-a4e4-823be631891e" containerID="f7047d6949ce3f8132b1d2ce7043864eb36118addfe08d0a3e6b08b2ad94a645" exitCode=0
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.922034 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ftb6f" event={"ID":"55592703-6d06-487c-a4e4-823be631891e","Type":"ContainerDied","Data":"f7047d6949ce3f8132b1d2ce7043864eb36118addfe08d0a3e6b08b2ad94a645"}
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.925391 4759 generic.go:334] "Generic (PLEG): container finished" podID="ec67af89-9918-4e80-95e5-a90ed96c7c04" containerID="1b74aed02e69ae52123cba02929413afb2d445abacd4f197515eceae2bbf81de" exitCode=0
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.925556 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8wnf5" event={"ID":"ec67af89-9918-4e80-95e5-a90ed96c7c04","Type":"ContainerDied","Data":"1b74aed02e69ae52123cba02929413afb2d445abacd4f197515eceae2bbf81de"}
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.931512 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xshzj" event={"ID":"ce7ddb84-b42d-4a8b-ae8c-a263a618b408","Type":"ContainerDied","Data":"72bd02a24a63331ec923c0f57ae91d96c26aefa3a6a18d863adde0206e1cff74"}
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.931562 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72bd02a24a63331ec923c0f57ae91d96c26aefa3a6a18d863adde0206e1cff74"
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.931534 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xshzj"
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.935583 4759 generic.go:334] "Generic (PLEG): container finished" podID="4f589ff0-e6f7-430a-85a0-4cbc410b0ff8" containerID="78d4f07a56d424abbfcd627597f9237a64884cdcb70b081ea01e3bb50949420b" exitCode=0
Dec 05 00:45:21 crc kubenswrapper[4759]: I1205 00:45:21.935643 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-456f-account-create-update-2fx6n" event={"ID":"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8","Type":"ContainerDied","Data":"78d4f07a56d424abbfcd627597f9237a64884cdcb70b081ea01e3bb50949420b"}
Dec 05 00:45:22 crc kubenswrapper[4759]: I1205 00:45:22.903797 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0"
Dec 05 00:45:22 crc kubenswrapper[4759]: I1205 00:45:22.918417 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28edaf49-80c4-4732-a19f-1f2348fcd8e7-etc-swift\") pod \"swift-storage-0\" (UID: \"28edaf49-80c4-4732-a19f-1f2348fcd8e7\") " pod="openstack/swift-storage-0"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.112648 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.380412 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-456f-account-create-update-2fx6n"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.495026 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7dea-account-create-update-mjncr"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.521672 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqqrl\" (UniqueName: \"kubernetes.io/projected/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-kube-api-access-dqqrl\") pod \"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8\" (UID: \"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.521762 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-operator-scripts\") pod \"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8\" (UID: \"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.523089 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f589ff0-e6f7-430a-85a0-4cbc410b0ff8" (UID: "4f589ff0-e6f7-430a-85a0-4cbc410b0ff8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.534912 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-kube-api-access-dqqrl" (OuterVolumeSpecName: "kube-api-access-dqqrl") pod "4f589ff0-e6f7-430a-85a0-4cbc410b0ff8" (UID: "4f589ff0-e6f7-430a-85a0-4cbc410b0ff8"). InnerVolumeSpecName "kube-api-access-dqqrl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.549775 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ftb6f"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.558479 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8wnf5"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.587009 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-936e-account-create-update-69tpk"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.593810 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nw9xk-config-s45g9"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.623134 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec67af89-9918-4e80-95e5-a90ed96c7c04-operator-scripts\") pod \"ec67af89-9918-4e80-95e5-a90ed96c7c04\" (UID: \"ec67af89-9918-4e80-95e5-a90ed96c7c04\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.623183 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55592703-6d06-487c-a4e4-823be631891e-operator-scripts\") pod \"55592703-6d06-487c-a4e4-823be631891e\" (UID: \"55592703-6d06-487c-a4e4-823be631891e\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.623238 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db21999b-a97e-4d2f-a9d1-c2f2b7049998-operator-scripts\") pod \"db21999b-a97e-4d2f-a9d1-c2f2b7049998\" (UID: \"db21999b-a97e-4d2f-a9d1-c2f2b7049998\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.623265 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4sw6\" (UniqueName: \"kubernetes.io/projected/ec67af89-9918-4e80-95e5-a90ed96c7c04-kube-api-access-q4sw6\") pod \"ec67af89-9918-4e80-95e5-a90ed96c7c04\" (UID: \"ec67af89-9918-4e80-95e5-a90ed96c7c04\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.623286 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnq6h\" (UniqueName: \"kubernetes.io/projected/db21999b-a97e-4d2f-a9d1-c2f2b7049998-kube-api-access-mnq6h\") pod \"db21999b-a97e-4d2f-a9d1-c2f2b7049998\" (UID: \"db21999b-a97e-4d2f-a9d1-c2f2b7049998\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.623327 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k68wj\" (UniqueName: \"kubernetes.io/projected/55592703-6d06-487c-a4e4-823be631891e-kube-api-access-k68wj\") pod \"55592703-6d06-487c-a4e4-823be631891e\" (UID: \"55592703-6d06-487c-a4e4-823be631891e\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.623710 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqqrl\" (UniqueName: \"kubernetes.io/projected/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-kube-api-access-dqqrl\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.623725 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.623811 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db21999b-a97e-4d2f-a9d1-c2f2b7049998-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db21999b-a97e-4d2f-a9d1-c2f2b7049998" (UID: "db21999b-a97e-4d2f-a9d1-c2f2b7049998"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.624674 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55592703-6d06-487c-a4e4-823be631891e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55592703-6d06-487c-a4e4-823be631891e" (UID: "55592703-6d06-487c-a4e4-823be631891e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.625056 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec67af89-9918-4e80-95e5-a90ed96c7c04-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec67af89-9918-4e80-95e5-a90ed96c7c04" (UID: "ec67af89-9918-4e80-95e5-a90ed96c7c04"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.627124 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec67af89-9918-4e80-95e5-a90ed96c7c04-kube-api-access-q4sw6" (OuterVolumeSpecName: "kube-api-access-q4sw6") pod "ec67af89-9918-4e80-95e5-a90ed96c7c04" (UID: "ec67af89-9918-4e80-95e5-a90ed96c7c04"). InnerVolumeSpecName "kube-api-access-q4sw6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.627170 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55592703-6d06-487c-a4e4-823be631891e-kube-api-access-k68wj" (OuterVolumeSpecName: "kube-api-access-k68wj") pod "55592703-6d06-487c-a4e4-823be631891e" (UID: "55592703-6d06-487c-a4e4-823be631891e"). InnerVolumeSpecName "kube-api-access-k68wj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.627707 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db21999b-a97e-4d2f-a9d1-c2f2b7049998-kube-api-access-mnq6h" (OuterVolumeSpecName: "kube-api-access-mnq6h") pod "db21999b-a97e-4d2f-a9d1-c2f2b7049998" (UID: "db21999b-a97e-4d2f-a9d1-c2f2b7049998"). InnerVolumeSpecName "kube-api-access-mnq6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.724657 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-additional-scripts\") pod \"5f563704-00ec-45f7-a74e-58fc124f4e59\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.724786 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-scripts\") pod \"5f563704-00ec-45f7-a74e-58fc124f4e59\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.724841 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run-ovn\") pod \"5f563704-00ec-45f7-a74e-58fc124f4e59\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.724923 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vtfq\" (UniqueName: \"kubernetes.io/projected/5f563704-00ec-45f7-a74e-58fc124f4e59-kube-api-access-4vtfq\") pod \"5f563704-00ec-45f7-a74e-58fc124f4e59\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.724954 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqc8n\" (UniqueName: \"kubernetes.io/projected/52600026-ecf0-4ee9-a517-e2e1005d3b5d-kube-api-access-cqc8n\") pod \"52600026-ecf0-4ee9-a517-e2e1005d3b5d\" (UID: \"52600026-ecf0-4ee9-a517-e2e1005d3b5d\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.724999 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52600026-ecf0-4ee9-a517-e2e1005d3b5d-operator-scripts\") pod \"52600026-ecf0-4ee9-a517-e2e1005d3b5d\" (UID: \"52600026-ecf0-4ee9-a517-e2e1005d3b5d\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725062 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-log-ovn\") pod \"5f563704-00ec-45f7-a74e-58fc124f4e59\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725106 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run\") pod \"5f563704-00ec-45f7-a74e-58fc124f4e59\" (UID: \"5f563704-00ec-45f7-a74e-58fc124f4e59\") "
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.724986 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5f563704-00ec-45f7-a74e-58fc124f4e59" (UID: "5f563704-00ec-45f7-a74e-58fc124f4e59"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725397 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5f563704-00ec-45f7-a74e-58fc124f4e59" (UID: "5f563704-00ec-45f7-a74e-58fc124f4e59"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725460 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run" (OuterVolumeSpecName: "var-run") pod "5f563704-00ec-45f7-a74e-58fc124f4e59" (UID: "5f563704-00ec-45f7-a74e-58fc124f4e59"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725716 4759 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-log-ovn\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725742 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec67af89-9918-4e80-95e5-a90ed96c7c04-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725757 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55592703-6d06-487c-a4e4-823be631891e-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725770 4759 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725783 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db21999b-a97e-4d2f-a9d1-c2f2b7049998-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725796 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4sw6\" (UniqueName: \"kubernetes.io/projected/ec67af89-9918-4e80-95e5-a90ed96c7c04-kube-api-access-q4sw6\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725808 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnq6h\" (UniqueName: \"kubernetes.io/projected/db21999b-a97e-4d2f-a9d1-c2f2b7049998-kube-api-access-mnq6h\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725821 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k68wj\" (UniqueName: \"kubernetes.io/projected/55592703-6d06-487c-a4e4-823be631891e-kube-api-access-k68wj\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725832 4759 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f563704-00ec-45f7-a74e-58fc124f4e59-var-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.725929 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5f563704-00ec-45f7-a74e-58fc124f4e59" (UID: "5f563704-00ec-45f7-a74e-58fc124f4e59"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.726125 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-scripts" (OuterVolumeSpecName: "scripts") pod "5f563704-00ec-45f7-a74e-58fc124f4e59" (UID: "5f563704-00ec-45f7-a74e-58fc124f4e59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.726267 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52600026-ecf0-4ee9-a517-e2e1005d3b5d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "52600026-ecf0-4ee9-a517-e2e1005d3b5d" (UID: "52600026-ecf0-4ee9-a517-e2e1005d3b5d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.728405 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f563704-00ec-45f7-a74e-58fc124f4e59-kube-api-access-4vtfq" (OuterVolumeSpecName: "kube-api-access-4vtfq") pod "5f563704-00ec-45f7-a74e-58fc124f4e59" (UID: "5f563704-00ec-45f7-a74e-58fc124f4e59"). InnerVolumeSpecName "kube-api-access-4vtfq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.729136 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52600026-ecf0-4ee9-a517-e2e1005d3b5d-kube-api-access-cqc8n" (OuterVolumeSpecName: "kube-api-access-cqc8n") pod "52600026-ecf0-4ee9-a517-e2e1005d3b5d" (UID: "52600026-ecf0-4ee9-a517-e2e1005d3b5d"). InnerVolumeSpecName "kube-api-access-cqc8n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.826953 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vtfq\" (UniqueName: \"kubernetes.io/projected/5f563704-00ec-45f7-a74e-58fc124f4e59-kube-api-access-4vtfq\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.826981 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqc8n\" (UniqueName: \"kubernetes.io/projected/52600026-ecf0-4ee9-a517-e2e1005d3b5d-kube-api-access-cqc8n\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.826993 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52600026-ecf0-4ee9-a517-e2e1005d3b5d-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.827001 4759 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-additional-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.827011 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f563704-00ec-45f7-a74e-58fc124f4e59-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.957657 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw9xk-config-s45g9" event={"ID":"5f563704-00ec-45f7-a74e-58fc124f4e59","Type":"ContainerDied","Data":"5161e780ab223388160c778bc557d1402143a8890e91ad70998caf50d7c804be"}
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.957704 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5161e780ab223388160c778bc557d1402143a8890e91ad70998caf50d7c804be"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.957761 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nw9xk-config-s45g9"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.967332 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7dea-account-create-update-mjncr" event={"ID":"db21999b-a97e-4d2f-a9d1-c2f2b7049998","Type":"ContainerDied","Data":"f960cd54518b4b90be40cdaf0bed6c814d47f6bf0270052c9a40650180a01f5a"}
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.967365 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f960cd54518b4b90be40cdaf0bed6c814d47f6bf0270052c9a40650180a01f5a"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.967379 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7dea-account-create-update-mjncr"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.969471 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8wnf5" event={"ID":"ec67af89-9918-4e80-95e5-a90ed96c7c04","Type":"ContainerDied","Data":"549eb9abe5800b470975f64d0d7ab093fac83ad85684fdc4c5919714f6134883"}
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.969636 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8wnf5"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.970741 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="549eb9abe5800b470975f64d0d7ab093fac83ad85684fdc4c5919714f6134883"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.971295 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ftb6f" event={"ID":"55592703-6d06-487c-a4e4-823be631891e","Type":"ContainerDied","Data":"a6b377e314c5cd91bf2e6a8bef4bfde2d6192db6019a3bce1ad23fae81072f1f"}
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.971367 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6b377e314c5cd91bf2e6a8bef4bfde2d6192db6019a3bce1ad23fae81072f1f"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.971412 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ftb6f"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.973545 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-456f-account-create-update-2fx6n" event={"ID":"4f589ff0-e6f7-430a-85a0-4cbc410b0ff8","Type":"ContainerDied","Data":"5a3b52a35229158d220f71ebdcace447aa9f166441c956c88a615f07a48ed00d"}
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.973574 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a3b52a35229158d220f71ebdcace447aa9f166441c956c88a615f07a48ed00d"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.973742 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-456f-account-create-update-2fx6n"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.975225 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-936e-account-create-update-69tpk" event={"ID":"52600026-ecf0-4ee9-a517-e2e1005d3b5d","Type":"ContainerDied","Data":"e1432730fc1ddb30e88ac803f241d696b1b2646451f782e0ff849919c5c03960"}
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.975251 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1432730fc1ddb30e88ac803f241d696b1b2646451f782e0ff849919c5c03960"
Dec 05 00:45:23 crc kubenswrapper[4759]: I1205 00:45:23.975427 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-936e-account-create-update-69tpk"
Dec 05 00:45:24 crc kubenswrapper[4759]: I1205 00:45:24.070243 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Dec 05 00:45:24 crc kubenswrapper[4759]: I1205 00:45:24.073939 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 00:45:24 crc kubenswrapper[4759]: I1205 00:45:24.544613 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Dec 05 00:45:24 crc kubenswrapper[4759]: I1205 00:45:24.629369 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Dec 05 00:45:24 crc kubenswrapper[4759]: I1205 00:45:24.861064 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nw9xk-config-s45g9"]
Dec 05 00:45:24 crc kubenswrapper[4759]: I1205 00:45:24.882506 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nw9xk-config-s45g9"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.016819 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"05ebec3d162f20a2bcbf4bc06b2da68381085fd40c85ec4eab703e9f2e36e375"}
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.178688 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f563704-00ec-45f7-a74e-58fc124f4e59" path="/var/lib/kubelet/pods/5f563704-00ec-45f7-a74e-58fc124f4e59/volumes"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.206659 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-5cmtn"]
Dec 05 00:45:25 crc kubenswrapper[4759]: E1205 00:45:25.207073 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55592703-6d06-487c-a4e4-823be631891e" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207091 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="55592703-6d06-487c-a4e4-823be631891e" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: E1205 00:45:25.207104 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db21999b-a97e-4d2f-a9d1-c2f2b7049998" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207111 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="db21999b-a97e-4d2f-a9d1-c2f2b7049998" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: E1205 00:45:25.207122 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52600026-ecf0-4ee9-a517-e2e1005d3b5d" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207130 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="52600026-ecf0-4ee9-a517-e2e1005d3b5d" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: E1205 00:45:25.207142 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7ddb84-b42d-4a8b-ae8c-a263a618b408" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207148 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7ddb84-b42d-4a8b-ae8c-a263a618b408" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: E1205 00:45:25.207165 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207172 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: E1205 00:45:25.207196 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca571ecf-ba87-47b0-acb6-a76d034f0b32" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207203 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca571ecf-ba87-47b0-acb6-a76d034f0b32" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: E1205 00:45:25.207212 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec67af89-9918-4e80-95e5-a90ed96c7c04" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207219 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec67af89-9918-4e80-95e5-a90ed96c7c04" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: E1205 00:45:25.207227 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f563704-00ec-45f7-a74e-58fc124f4e59" containerName="ovn-config"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207233 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f563704-00ec-45f7-a74e-58fc124f4e59" containerName="ovn-config"
Dec 05 00:45:25 crc kubenswrapper[4759]: E1205 00:45:25.207246 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f589ff0-e6f7-430a-85a0-4cbc410b0ff8" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207252 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f589ff0-e6f7-430a-85a0-4cbc410b0ff8" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207423 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="52600026-ecf0-4ee9-a517-e2e1005d3b5d" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207437 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce7ddb84-b42d-4a8b-ae8c-a263a618b408" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207463 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="db21999b-a97e-4d2f-a9d1-c2f2b7049998" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207473 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec67af89-9918-4e80-95e5-a90ed96c7c04" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207482 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="55592703-6d06-487c-a4e4-823be631891e" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207495 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207508 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f563704-00ec-45f7-a74e-58fc124f4e59" containerName="ovn-config"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207521 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f589ff0-e6f7-430a-85a0-4cbc410b0ff8" containerName="mariadb-account-create-update"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.207531 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca571ecf-ba87-47b0-acb6-a76d034f0b32" containerName="mariadb-database-create"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.208134 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5cmtn"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.228754 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6349-account-create-update-7kk9v"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.229981 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6349-account-create-update-7kk9v"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.232039 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.240104 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5cmtn"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.253894 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6349-account-create-update-7kk9v"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.275941 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-operator-scripts\") pod \"barbican-6349-account-create-update-7kk9v\" (UID: \"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb\") " pod="openstack/barbican-6349-account-create-update-7kk9v"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.276007 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/179ac88c-cac6-44a8-9fa5-54bed12d118c-operator-scripts\") pod \"cinder-db-create-5cmtn\" (UID: \"179ac88c-cac6-44a8-9fa5-54bed12d118c\") " pod="openstack/cinder-db-create-5cmtn"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.276036 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b22l8\" (UniqueName: \"kubernetes.io/projected/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-kube-api-access-b22l8\") pod \"barbican-6349-account-create-update-7kk9v\" (UID: \"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb\") " pod="openstack/barbican-6349-account-create-update-7kk9v"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.276116 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktlmg\" (UniqueName: \"kubernetes.io/projected/179ac88c-cac6-44a8-9fa5-54bed12d118c-kube-api-access-ktlmg\") pod \"cinder-db-create-5cmtn\" (UID: \"179ac88c-cac6-44a8-9fa5-54bed12d118c\") " pod="openstack/cinder-db-create-5cmtn"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.307433 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-4vclp"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.308865 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4vclp"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.314611 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3781-account-create-update-2sknq"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.332291 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4vclp"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.332653 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3781-account-create-update-2sknq"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.337921 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.338117 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.339713 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.341917 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.355378 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3781-account-create-update-2sknq"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.365695 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.379513 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b22l8\" (UniqueName: \"kubernetes.io/projected/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-kube-api-access-b22l8\") pod \"barbican-6349-account-create-update-7kk9v\" (UID: \"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb\") " pod="openstack/barbican-6349-account-create-update-7kk9v"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.379552 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/179ac88c-cac6-44a8-9fa5-54bed12d118c-operator-scripts\") pod \"cinder-db-create-5cmtn\" (UID: \"179ac88c-cac6-44a8-9fa5-54bed12d118c\") " pod="openstack/cinder-db-create-5cmtn"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.379606 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-config-data\") pod \"mysqld-exporter-0\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " pod="openstack/mysqld-exporter-0"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.379647 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beaae381-be72-48be-8a0e-eba21df779b7-operator-scripts\") pod \"cinder-3781-account-create-update-2sknq\" (UID: \"beaae381-be72-48be-8a0e-eba21df779b7\") " pod="openstack/cinder-3781-account-create-update-2sknq"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.379684 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktlmg\" (UniqueName: \"kubernetes.io/projected/179ac88c-cac6-44a8-9fa5-54bed12d118c-kube-api-access-ktlmg\") pod \"cinder-db-create-5cmtn\" (UID: \"179ac88c-cac6-44a8-9fa5-54bed12d118c\") " pod="openstack/cinder-db-create-5cmtn"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.379729 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zsv\" (UniqueName: \"kubernetes.io/projected/035a8e83-7e26-4eb5-939c-4c70a2c86d94-kube-api-access-s7zsv\") pod \"barbican-db-create-4vclp\" (UID: \"035a8e83-7e26-4eb5-939c-4c70a2c86d94\") " pod="openstack/barbican-db-create-4vclp"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.379758 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blp96\" (UniqueName: \"kubernetes.io/projected/242736a8-8641-4915-b5cb-e271ce361e3a-kube-api-access-blp96\") pod \"mysqld-exporter-0\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " pod="openstack/mysqld-exporter-0"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.379802 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-operator-scripts\") pod \"barbican-6349-account-create-update-7kk9v\" (UID: \"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb\") " pod="openstack/barbican-6349-account-create-update-7kk9v"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.379832 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " pod="openstack/mysqld-exporter-0"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.379849 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzxw\" (UniqueName: \"kubernetes.io/projected/beaae381-be72-48be-8a0e-eba21df779b7-kube-api-access-tnzxw\") pod \"cinder-3781-account-create-update-2sknq\" (UID: \"beaae381-be72-48be-8a0e-eba21df779b7\") " pod="openstack/cinder-3781-account-create-update-2sknq"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.379868 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035a8e83-7e26-4eb5-939c-4c70a2c86d94-operator-scripts\") pod \"barbican-db-create-4vclp\" (UID: \"035a8e83-7e26-4eb5-939c-4c70a2c86d94\") " pod="openstack/barbican-db-create-4vclp"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.382481 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/179ac88c-cac6-44a8-9fa5-54bed12d118c-operator-scripts\") pod \"cinder-db-create-5cmtn\" (UID: \"179ac88c-cac6-44a8-9fa5-54bed12d118c\") " pod="openstack/cinder-db-create-5cmtn"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.383030 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-operator-scripts\") pod \"barbican-6349-account-create-update-7kk9v\" (UID: \"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb\") " pod="openstack/barbican-6349-account-create-update-7kk9v"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.416158 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b22l8\" (UniqueName: \"kubernetes.io/projected/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-kube-api-access-b22l8\") pod \"barbican-6349-account-create-update-7kk9v\" (UID: \"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb\") " pod="openstack/barbican-6349-account-create-update-7kk9v"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.432912 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktlmg\" (UniqueName: \"kubernetes.io/projected/179ac88c-cac6-44a8-9fa5-54bed12d118c-kube-api-access-ktlmg\") pod \"cinder-db-create-5cmtn\" (UID: \"179ac88c-cac6-44a8-9fa5-54bed12d118c\") " pod="openstack/cinder-db-create-5cmtn"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.483873 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " pod="openstack/mysqld-exporter-0"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.483925 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzxw\" (UniqueName: \"kubernetes.io/projected/beaae381-be72-48be-8a0e-eba21df779b7-kube-api-access-tnzxw\") pod \"cinder-3781-account-create-update-2sknq\" (UID: \"beaae381-be72-48be-8a0e-eba21df779b7\") " pod="openstack/cinder-3781-account-create-update-2sknq"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.483946 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035a8e83-7e26-4eb5-939c-4c70a2c86d94-operator-scripts\") pod \"barbican-db-create-4vclp\" (UID: \"035a8e83-7e26-4eb5-939c-4c70a2c86d94\") " pod="openstack/barbican-db-create-4vclp"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.484028 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-config-data\") pod \"mysqld-exporter-0\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " pod="openstack/mysqld-exporter-0"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.484068 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beaae381-be72-48be-8a0e-eba21df779b7-operator-scripts\") pod \"cinder-3781-account-create-update-2sknq\" (UID: \"beaae381-be72-48be-8a0e-eba21df779b7\") " pod="openstack/cinder-3781-account-create-update-2sknq"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.484131 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zsv\" (UniqueName: \"kubernetes.io/projected/035a8e83-7e26-4eb5-939c-4c70a2c86d94-kube-api-access-s7zsv\") pod \"barbican-db-create-4vclp\" (UID: \"035a8e83-7e26-4eb5-939c-4c70a2c86d94\") " pod="openstack/barbican-db-create-4vclp"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.484167 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blp96\" (UniqueName: \"kubernetes.io/projected/242736a8-8641-4915-b5cb-e271ce361e3a-kube-api-access-blp96\") pod \"mysqld-exporter-0\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " pod="openstack/mysqld-exporter-0"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.485675 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beaae381-be72-48be-8a0e-eba21df779b7-operator-scripts\") pod \"cinder-3781-account-create-update-2sknq\" (UID: \"beaae381-be72-48be-8a0e-eba21df779b7\") " pod="openstack/cinder-3781-account-create-update-2sknq"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.486065 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035a8e83-7e26-4eb5-939c-4c70a2c86d94-operator-scripts\") pod \"barbican-db-create-4vclp\" (UID: \"035a8e83-7e26-4eb5-939c-4c70a2c86d94\") " pod="openstack/barbican-db-create-4vclp"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.499551 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-config-data\") pod \"mysqld-exporter-0\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " pod="openstack/mysqld-exporter-0"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.500879 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " pod="openstack/mysqld-exporter-0"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.507347 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-bbe7-account-create-update-tmcgm"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.508565 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-bbe7-account-create-update-tmcgm"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.516464 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zsv\" (UniqueName: \"kubernetes.io/projected/035a8e83-7e26-4eb5-939c-4c70a2c86d94-kube-api-access-s7zsv\") pod \"barbican-db-create-4vclp\" (UID: \"035a8e83-7e26-4eb5-939c-4c70a2c86d94\") " pod="openstack/barbican-db-create-4vclp"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.517449 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blp96\" (UniqueName: \"kubernetes.io/projected/242736a8-8641-4915-b5cb-e271ce361e3a-kube-api-access-blp96\") pod \"mysqld-exporter-0\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " pod="openstack/mysqld-exporter-0"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.518144 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.522226 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzxw\" (UniqueName: \"kubernetes.io/projected/beaae381-be72-48be-8a0e-eba21df779b7-kube-api-access-tnzxw\") pod \"cinder-3781-account-create-update-2sknq\" (UID: \"beaae381-be72-48be-8a0e-eba21df779b7\") " pod="openstack/cinder-3781-account-create-update-2sknq"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.526012 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5cmtn"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.531589 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-bbe7-account-create-update-tmcgm"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.552076 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6349-account-create-update-7kk9v"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.586072 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glp8m\" (UniqueName: \"kubernetes.io/projected/5e1f06ab-529e-4253-bf17-1255b1226d3f-kube-api-access-glp8m\") pod \"heat-bbe7-account-create-update-tmcgm\" (UID: \"5e1f06ab-529e-4253-bf17-1255b1226d3f\") " pod="openstack/heat-bbe7-account-create-update-tmcgm"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.586199 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1f06ab-529e-4253-bf17-1255b1226d3f-operator-scripts\") pod \"heat-bbe7-account-create-update-tmcgm\" (UID: \"5e1f06ab-529e-4253-bf17-1255b1226d3f\") " pod="openstack/heat-bbe7-account-create-update-tmcgm"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.600635 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-2b80-account-create-update-8nxsz"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.608282 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2b80-account-create-update-8nxsz"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.610280 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.611945 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2b80-account-create-update-8nxsz"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.644030 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4vclp"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.676628 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3781-account-create-update-2sknq"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.687443 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glp8m\" (UniqueName: \"kubernetes.io/projected/5e1f06ab-529e-4253-bf17-1255b1226d3f-kube-api-access-glp8m\") pod \"heat-bbe7-account-create-update-tmcgm\" (UID: \"5e1f06ab-529e-4253-bf17-1255b1226d3f\") " pod="openstack/heat-bbe7-account-create-update-tmcgm"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.687513 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-operator-scripts\") pod \"neutron-2b80-account-create-update-8nxsz\" (UID: \"57dcf39f-2def-45a4-b9f7-b9138e9a1a64\") " pod="openstack/neutron-2b80-account-create-update-8nxsz"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.687546 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1f06ab-529e-4253-bf17-1255b1226d3f-operator-scripts\") pod \"heat-bbe7-account-create-update-tmcgm\" (UID: \"5e1f06ab-529e-4253-bf17-1255b1226d3f\") " pod="openstack/heat-bbe7-account-create-update-tmcgm"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.687594 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47rbn\" (UniqueName: \"kubernetes.io/projected/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-kube-api-access-47rbn\") pod \"neutron-2b80-account-create-update-8nxsz\" (UID: \"57dcf39f-2def-45a4-b9f7-b9138e9a1a64\") " pod="openstack/neutron-2b80-account-create-update-8nxsz"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.688646 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1f06ab-529e-4253-bf17-1255b1226d3f-operator-scripts\") pod \"heat-bbe7-account-create-update-tmcgm\" (UID: \"5e1f06ab-529e-4253-bf17-1255b1226d3f\") " pod="openstack/heat-bbe7-account-create-update-tmcgm"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.697820 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.701260 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-njwfb"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.706930 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-njwfb"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.712153 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glp8m\" (UniqueName: \"kubernetes.io/projected/5e1f06ab-529e-4253-bf17-1255b1226d3f-kube-api-access-glp8m\") pod \"heat-bbe7-account-create-update-tmcgm\" (UID: \"5e1f06ab-529e-4253-bf17-1255b1226d3f\") " pod="openstack/heat-bbe7-account-create-update-tmcgm"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.732381 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-njwfb"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.767289 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rn7b5"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.768391 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rn7b5"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.789633 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-operator-scripts\") pod \"neutron-db-create-rn7b5\" (UID: \"4a213f68-4a29-4f0d-8f47-fb47a7ed1769\") " pod="openstack/neutron-db-create-rn7b5"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.789693 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859f7f90-7f35-4c95-80fd-240e91834ff6-operator-scripts\") pod \"heat-db-create-njwfb\" (UID: \"859f7f90-7f35-4c95-80fd-240e91834ff6\") " pod="openstack/heat-db-create-njwfb"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.789727 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-operator-scripts\") pod \"neutron-2b80-account-create-update-8nxsz\" (UID: \"57dcf39f-2def-45a4-b9f7-b9138e9a1a64\") " pod="openstack/neutron-2b80-account-create-update-8nxsz"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.789764 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47rbn\" (UniqueName: \"kubernetes.io/projected/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-kube-api-access-47rbn\") pod \"neutron-2b80-account-create-update-8nxsz\" (UID: \"57dcf39f-2def-45a4-b9f7-b9138e9a1a64\") " pod="openstack/neutron-2b80-account-create-update-8nxsz"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.789812 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb5xq\" (UniqueName: \"kubernetes.io/projected/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-kube-api-access-zb5xq\") pod \"neutron-db-create-rn7b5\" (UID: \"4a213f68-4a29-4f0d-8f47-fb47a7ed1769\") " pod="openstack/neutron-db-create-rn7b5"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.789832 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xwj8\" (UniqueName: \"kubernetes.io/projected/859f7f90-7f35-4c95-80fd-240e91834ff6-kube-api-access-4xwj8\") pod \"heat-db-create-njwfb\" (UID: \"859f7f90-7f35-4c95-80fd-240e91834ff6\") " pod="openstack/heat-db-create-njwfb"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.790750 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-operator-scripts\") pod \"neutron-2b80-account-create-update-8nxsz\" (UID: \"57dcf39f-2def-45a4-b9f7-b9138e9a1a64\") " pod="openstack/neutron-2b80-account-create-update-8nxsz"
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.792937 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rn7b5"]
Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.810734 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47rbn\" (UniqueName: \"kubernetes.io/projected/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-kube-api-access-47rbn\") pod \"neutron-2b80-account-create-update-8nxsz\" (UID: \"57dcf39f-2def-45a4-b9f7-b9138e9a1a64\") " pod="openstack/neutron-2b80-account-create-update-8nxsz"
Dec 05 00:45:25 crc kubenswrapper[4759]:
I1205 00:45:25.871342 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-bbe7-account-create-update-tmcgm" Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.894366 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-operator-scripts\") pod \"neutron-db-create-rn7b5\" (UID: \"4a213f68-4a29-4f0d-8f47-fb47a7ed1769\") " pod="openstack/neutron-db-create-rn7b5" Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.894435 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859f7f90-7f35-4c95-80fd-240e91834ff6-operator-scripts\") pod \"heat-db-create-njwfb\" (UID: \"859f7f90-7f35-4c95-80fd-240e91834ff6\") " pod="openstack/heat-db-create-njwfb" Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.894489 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb5xq\" (UniqueName: \"kubernetes.io/projected/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-kube-api-access-zb5xq\") pod \"neutron-db-create-rn7b5\" (UID: \"4a213f68-4a29-4f0d-8f47-fb47a7ed1769\") " pod="openstack/neutron-db-create-rn7b5" Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.894507 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xwj8\" (UniqueName: \"kubernetes.io/projected/859f7f90-7f35-4c95-80fd-240e91834ff6-kube-api-access-4xwj8\") pod \"heat-db-create-njwfb\" (UID: \"859f7f90-7f35-4c95-80fd-240e91834ff6\") " pod="openstack/heat-db-create-njwfb" Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.895371 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-operator-scripts\") pod \"neutron-db-create-rn7b5\" (UID: \"4a213f68-4a29-4f0d-8f47-fb47a7ed1769\") " pod="openstack/neutron-db-create-rn7b5" Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.895805 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859f7f90-7f35-4c95-80fd-240e91834ff6-operator-scripts\") pod \"heat-db-create-njwfb\" (UID: \"859f7f90-7f35-4c95-80fd-240e91834ff6\") " pod="openstack/heat-db-create-njwfb" Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.925131 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xwj8\" (UniqueName: \"kubernetes.io/projected/859f7f90-7f35-4c95-80fd-240e91834ff6-kube-api-access-4xwj8\") pod \"heat-db-create-njwfb\" (UID: \"859f7f90-7f35-4c95-80fd-240e91834ff6\") " pod="openstack/heat-db-create-njwfb" Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.925179 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb5xq\" (UniqueName: \"kubernetes.io/projected/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-kube-api-access-zb5xq\") pod \"neutron-db-create-rn7b5\" (UID: \"4a213f68-4a29-4f0d-8f47-fb47a7ed1769\") " pod="openstack/neutron-db-create-rn7b5" Dec 05 00:45:25 crc kubenswrapper[4759]: I1205 00:45:25.930407 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2b80-account-create-update-8nxsz" Dec 05 00:45:26 crc kubenswrapper[4759]: I1205 00:45:26.070779 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-njwfb" Dec 05 00:45:26 crc kubenswrapper[4759]: I1205 00:45:26.074979 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:26 crc kubenswrapper[4759]: I1205 00:45:26.078096 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:26 crc kubenswrapper[4759]: I1205 00:45:26.088459 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rn7b5" Dec 05 00:45:26 crc kubenswrapper[4759]: I1205 00:45:26.989768 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-njwfb"] Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.001041 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-bbe7-account-create-update-tmcgm"] Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.011158 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.097668 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-njwfb" event={"ID":"859f7f90-7f35-4c95-80fd-240e91834ff6","Type":"ContainerStarted","Data":"595cb69a0833ab0e747c2388f13fd961bb795ff3cc354cf491852ab7bf9e253d"} Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.112252 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"978cb6a45f0975035ded840585cefdebaf6de54c32d0061aed28d8a40e21f597"} Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.126526 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-bbe7-account-create-update-tmcgm" event={"ID":"5e1f06ab-529e-4253-bf17-1255b1226d3f","Type":"ContainerStarted","Data":"101f8569a98bc8fa5d1dbcc4c84b0f278e6b890f67814168073fecaee9802733"} Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.145566 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"242736a8-8641-4915-b5cb-e271ce361e3a","Type":"ContainerStarted","Data":"e35350a0a6712eb8e65936aa23aa68a1eff7c54742d2f82eeaa88d50473194b9"} Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.146334 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.249166 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rn7b5"] Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.282960 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5cmtn"] Dec 05 00:45:27 crc kubenswrapper[4759]: W1205 00:45:27.293425 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179ac88c_cac6_44a8_9fa5_54bed12d118c.slice/crio-cb0343114820c897dd522fb52e834160621b8a933d2c5e06f7e665be710bdef7 WatchSource:0}: Error finding container cb0343114820c897dd522fb52e834160621b8a933d2c5e06f7e665be710bdef7: Status 404 returned error can't find the container with id cb0343114820c897dd522fb52e834160621b8a933d2c5e06f7e665be710bdef7 Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.321542 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4vclp"] Dec 05 00:45:27 crc 
kubenswrapper[4759]: W1205 00:45:27.468738 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod035a8e83_7e26_4eb5_939c_4c70a2c86d94.slice/crio-7f9e5ba3f913ef21ccf6bd60df7b451a6137d03b79ba1628d0773d968a3c2d0d WatchSource:0}: Error finding container 7f9e5ba3f913ef21ccf6bd60df7b451a6137d03b79ba1628d0773d968a3c2d0d: Status 404 returned error can't find the container with id 7f9e5ba3f913ef21ccf6bd60df7b451a6137d03b79ba1628d0773d968a3c2d0d Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.587585 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2b80-account-create-update-8nxsz"] Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.607553 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6349-account-create-update-7kk9v"] Dec 05 00:45:27 crc kubenswrapper[4759]: I1205 00:45:27.661485 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3781-account-create-update-2sknq"] Dec 05 00:45:27 crc kubenswrapper[4759]: W1205 00:45:27.686607 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57dcf39f_2def_45a4_b9f7_b9138e9a1a64.slice/crio-9daac2211a4849adb2761af0c09fa1f57a628cd1f1306f3d0090225ece40a3f9 WatchSource:0}: Error finding container 9daac2211a4849adb2761af0c09fa1f57a628cd1f1306f3d0090225ece40a3f9: Status 404 returned error can't find the container with id 9daac2211a4849adb2761af0c09fa1f57a628cd1f1306f3d0090225ece40a3f9 Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.178082 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5cmtn" event={"ID":"179ac88c-cac6-44a8-9fa5-54bed12d118c","Type":"ContainerStarted","Data":"e2e8556c5516f39afbf9a226c20463fba9ebafa9b825f57252bd865810f4efc4"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.178546 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5cmtn" event={"ID":"179ac88c-cac6-44a8-9fa5-54bed12d118c","Type":"ContainerStarted","Data":"cb0343114820c897dd522fb52e834160621b8a933d2c5e06f7e665be710bdef7"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.182999 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-njwfb" event={"ID":"859f7f90-7f35-4c95-80fd-240e91834ff6","Type":"ContainerStarted","Data":"39bc75bdc433c1b8bcf585254b2e7c77f08d3faed7e1304bd7a3336ce72874c6"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.185967 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"821c54e190f7dda791943963de31eb0e80337eaff50deadb578a77f06f9d7d8a"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.186021 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"2c23078c6d69e8b1af2499f648228eb646ffd26a786ea9852c2e19243abc6162"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.190739 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3781-account-create-update-2sknq" event={"ID":"beaae381-be72-48be-8a0e-eba21df779b7","Type":"ContainerStarted","Data":"01b646635b3d0a4e19632df6890606f34b62d1e02493ab54d95570d9c86c4775"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.194362 4759 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/heat-bbe7-account-create-update-tmcgm" event={"ID":"5e1f06ab-529e-4253-bf17-1255b1226d3f","Type":"ContainerStarted","Data":"01915c63b208afe0d18292a0df305d5656b0506de0829b0db9e0fee6823d9a82"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.195797 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6349-account-create-update-7kk9v" event={"ID":"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb","Type":"ContainerStarted","Data":"44b5f6fd941e695a776bab3a56e3ad0c38ff79b09d7d5f75c167804bff0fe8dc"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.197475 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2b80-account-create-update-8nxsz" event={"ID":"57dcf39f-2def-45a4-b9f7-b9138e9a1a64","Type":"ContainerStarted","Data":"9daac2211a4849adb2761af0c09fa1f57a628cd1f1306f3d0090225ece40a3f9"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.198850 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4vclp" event={"ID":"035a8e83-7e26-4eb5-939c-4c70a2c86d94","Type":"ContainerStarted","Data":"41d11559900b9c3cd45ffc9403c2465b3515377629e01c9ec3a82a728480b75e"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.198894 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4vclp" event={"ID":"035a8e83-7e26-4eb5-939c-4c70a2c86d94","Type":"ContainerStarted","Data":"7f9e5ba3f913ef21ccf6bd60df7b451a6137d03b79ba1628d0773d968a3c2d0d"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.200572 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rn7b5" event={"ID":"4a213f68-4a29-4f0d-8f47-fb47a7ed1769","Type":"ContainerStarted","Data":"7c9059d078f95d08702fa82d53350199df4cc766991c8d164b379b9c60934964"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.200607 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rn7b5" event={"ID":"4a213f68-4a29-4f0d-8f47-fb47a7ed1769","Type":"ContainerStarted","Data":"aba877ca7b40f82d9bdf0905e5c7b6415b47d70ef85f7600ad458127e21da3f0"} Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.209272 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-5cmtn" podStartSLOduration=3.209256296 podStartE2EDuration="3.209256296s" podCreationTimestamp="2025-12-05 00:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:28.206875527 +0000 UTC m=+1347.422536477" watchObservedRunningTime="2025-12-05 00:45:28.209256296 +0000 UTC m=+1347.424917246" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.230654 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-rn7b5" podStartSLOduration=3.23063559 podStartE2EDuration="3.23063559s" podCreationTimestamp="2025-12-05 00:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:28.222540191 +0000 UTC m=+1347.438201131" watchObservedRunningTime="2025-12-05 00:45:28.23063559 +0000 UTC m=+1347.446296540" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.333061 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-bbe7-account-create-update-tmcgm" podStartSLOduration=3.333044104 podStartE2EDuration="3.333044104s" podCreationTimestamp="2025-12-05 00:45:25 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:28.262612626 +0000 UTC m=+1347.478273576" watchObservedRunningTime="2025-12-05 00:45:28.333044104 +0000 UTC m=+1347.548705054" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.346221 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-4vclp" podStartSLOduration=3.346202768 podStartE2EDuration="3.346202768s" podCreationTimestamp="2025-12-05 00:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:28.301756867 +0000 UTC m=+1347.517417817" watchObservedRunningTime="2025-12-05 00:45:28.346202768 +0000 UTC m=+1347.561863718" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.471558 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-x67vg"] Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.472842 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.478933 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.479464 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-x67vg"] Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.479523 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nl4ht" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.479553 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.479592 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.515884 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-config-data\") pod \"keystone-db-sync-x67vg\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.515987 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-combined-ca-bundle\") pod \"keystone-db-sync-x67vg\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.516028 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24tkc\" (UniqueName: \"kubernetes.io/projected/e5d07555-31e9-4b86-b4eb-56b184aef5b7-kube-api-access-24tkc\") pod \"keystone-db-sync-x67vg\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.583045 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hl5mz"] Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.584288 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.586071 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.586556 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7zchw" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.595438 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hl5mz"] Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.619809 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-config-data\") pod \"keystone-db-sync-x67vg\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.619923 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-combined-ca-bundle\") pod \"keystone-db-sync-x67vg\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.619967 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24tkc\" (UniqueName: \"kubernetes.io/projected/e5d07555-31e9-4b86-b4eb-56b184aef5b7-kube-api-access-24tkc\") pod \"keystone-db-sync-x67vg\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.629300 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-combined-ca-bundle\") pod \"keystone-db-sync-x67vg\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.639372 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-config-data\") pod \"keystone-db-sync-x67vg\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.642595 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24tkc\" (UniqueName: \"kubernetes.io/projected/e5d07555-31e9-4b86-b4eb-56b184aef5b7-kube-api-access-24tkc\") pod \"keystone-db-sync-x67vg\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.722229 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-combined-ca-bundle\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.722291 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-config-data\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") 
" pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.722470 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rxbh\" (UniqueName: \"kubernetes.io/projected/538f07d4-2a1a-47e9-aec3-161f7b23af6f-kube-api-access-5rxbh\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.722525 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-db-sync-config-data\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.823586 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-db-sync-config-data\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.823655 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-combined-ca-bundle\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.823700 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-config-data\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.823782 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rxbh\" (UniqueName: \"kubernetes.io/projected/538f07d4-2a1a-47e9-aec3-161f7b23af6f-kube-api-access-5rxbh\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.827468 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-combined-ca-bundle\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.828798 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-db-sync-config-data\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.829414 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-config-data\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.847193 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5rxbh\" (UniqueName: \"kubernetes.io/projected/538f07d4-2a1a-47e9-aec3-161f7b23af6f-kube-api-access-5rxbh\") pod \"glance-db-sync-hl5mz\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.913945 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:28 crc kubenswrapper[4759]: I1205 00:45:28.921607 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hl5mz" Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.238945 4759 generic.go:334] "Generic (PLEG): container finished" podID="5e1f06ab-529e-4253-bf17-1255b1226d3f" containerID="01915c63b208afe0d18292a0df305d5656b0506de0829b0db9e0fee6823d9a82" exitCode=0 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.239238 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-bbe7-account-create-update-tmcgm" event={"ID":"5e1f06ab-529e-4253-bf17-1255b1226d3f","Type":"ContainerDied","Data":"01915c63b208afe0d18292a0df305d5656b0506de0829b0db9e0fee6823d9a82"} Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.276323 4759 generic.go:334] "Generic (PLEG): container finished" podID="4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb" containerID="9de5ddb54008cf35e5d8f0a9131ea80a4c64aa12bf3aae314459c4f332dd5621" exitCode=0 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.276388 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6349-account-create-update-7kk9v" event={"ID":"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb","Type":"ContainerDied","Data":"9de5ddb54008cf35e5d8f0a9131ea80a4c64aa12bf3aae314459c4f332dd5621"} Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.289811 4759 generic.go:334] "Generic (PLEG): container finished" podID="179ac88c-cac6-44a8-9fa5-54bed12d118c" containerID="e2e8556c5516f39afbf9a226c20463fba9ebafa9b825f57252bd865810f4efc4" exitCode=0 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.289908 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5cmtn" event={"ID":"179ac88c-cac6-44a8-9fa5-54bed12d118c","Type":"ContainerDied","Data":"e2e8556c5516f39afbf9a226c20463fba9ebafa9b825f57252bd865810f4efc4"} Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.291503 4759 generic.go:334] "Generic (PLEG): container finished" podID="57dcf39f-2def-45a4-b9f7-b9138e9a1a64" containerID="316fd171e5af8ce8e40018636c8b73154191034b940cf911d03b705f255d9fe0" exitCode=0 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.291564 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2b80-account-create-update-8nxsz" event={"ID":"57dcf39f-2def-45a4-b9f7-b9138e9a1a64","Type":"ContainerDied","Data":"316fd171e5af8ce8e40018636c8b73154191034b940cf911d03b705f255d9fe0"} Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.293463 4759 generic.go:334] "Generic (PLEG): container finished" podID="035a8e83-7e26-4eb5-939c-4c70a2c86d94" containerID="41d11559900b9c3cd45ffc9403c2465b3515377629e01c9ec3a82a728480b75e" exitCode=0 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.293511 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4vclp" event={"ID":"035a8e83-7e26-4eb5-939c-4c70a2c86d94","Type":"ContainerDied","Data":"41d11559900b9c3cd45ffc9403c2465b3515377629e01c9ec3a82a728480b75e"} Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 
00:45:29.322777 4759 generic.go:334] "Generic (PLEG): container finished" podID="859f7f90-7f35-4c95-80fd-240e91834ff6" containerID="39bc75bdc433c1b8bcf585254b2e7c77f08d3faed7e1304bd7a3336ce72874c6" exitCode=0 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.322863 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-njwfb" event={"ID":"859f7f90-7f35-4c95-80fd-240e91834ff6","Type":"ContainerDied","Data":"39bc75bdc433c1b8bcf585254b2e7c77f08d3faed7e1304bd7a3336ce72874c6"} Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.350578 4759 generic.go:334] "Generic (PLEG): container finished" podID="4a213f68-4a29-4f0d-8f47-fb47a7ed1769" containerID="7c9059d078f95d08702fa82d53350199df4cc766991c8d164b379b9c60934964" exitCode=0 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.350642 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rn7b5" event={"ID":"4a213f68-4a29-4f0d-8f47-fb47a7ed1769","Type":"ContainerDied","Data":"7c9059d078f95d08702fa82d53350199df4cc766991c8d164b379b9c60934964"} Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.356702 4759 generic.go:334] "Generic (PLEG): container finished" podID="beaae381-be72-48be-8a0e-eba21df779b7" containerID="5147e1c6fcf02724f81e0b312ec97933269738613fe307280363a4e04a8e31d1" exitCode=0 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.356753 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3781-account-create-update-2sknq" event={"ID":"beaae381-be72-48be-8a0e-eba21df779b7","Type":"ContainerDied","Data":"5147e1c6fcf02724f81e0b312ec97933269738613fe307280363a4e04a8e31d1"} Dec 05 00:45:29 crc kubenswrapper[4759]: W1205 00:45:29.881563 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod538f07d4_2a1a_47e9_aec3_161f7b23af6f.slice/crio-239cd63195d4ce714235f095a1bf4df6e04dcc4e0cef08e5faca361f74b7e8e7 WatchSource:0}: Error finding container 239cd63195d4ce714235f095a1bf4df6e04dcc4e0cef08e5faca361f74b7e8e7: Status 404 returned error can't find the container with id 239cd63195d4ce714235f095a1bf4df6e04dcc4e0cef08e5faca361f74b7e8e7 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.882071 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hl5mz"] Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.914945 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.915197 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="prometheus" containerID="cri-o://19bd0bc5a80f37d207a3d873f7b7c86098312bc33d67181f4d441a6568ba051c" gracePeriod=600 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.915595 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="thanos-sidecar" containerID="cri-o://0c07116d4edac872b85967f4d4208c5c0b943a30727ba564b3e125536795746b" gracePeriod=600 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.915650 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="config-reloader" 
containerID="cri-o://a7109c4f65c82d137894f5bb0154f780f944846620a6e73eded15c6d36491d88" gracePeriod=600 Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.932321 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-njwfb" Dec 05 00:45:29 crc kubenswrapper[4759]: I1205 00:45:29.947583 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-x67vg"] Dec 05 00:45:29 crc kubenswrapper[4759]: W1205 00:45:29.952163 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5d07555_31e9_4b86_b4eb_56b184aef5b7.slice/crio-a9289c40402da4c95a2a36b05f1dd12f8b150dd7c72d31d7cc8da7608fd11174 WatchSource:0}: Error finding container a9289c40402da4c95a2a36b05f1dd12f8b150dd7c72d31d7cc8da7608fd11174: Status 404 returned error can't find the container with id a9289c40402da4c95a2a36b05f1dd12f8b150dd7c72d31d7cc8da7608fd11174 Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.049484 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859f7f90-7f35-4c95-80fd-240e91834ff6-operator-scripts\") pod \"859f7f90-7f35-4c95-80fd-240e91834ff6\" (UID: \"859f7f90-7f35-4c95-80fd-240e91834ff6\") " Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.049586 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xwj8\" (UniqueName: \"kubernetes.io/projected/859f7f90-7f35-4c95-80fd-240e91834ff6-kube-api-access-4xwj8\") pod \"859f7f90-7f35-4c95-80fd-240e91834ff6\" (UID: \"859f7f90-7f35-4c95-80fd-240e91834ff6\") " Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.052399 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/859f7f90-7f35-4c95-80fd-240e91834ff6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "859f7f90-7f35-4c95-80fd-240e91834ff6" (UID: "859f7f90-7f35-4c95-80fd-240e91834ff6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.055577 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/859f7f90-7f35-4c95-80fd-240e91834ff6-kube-api-access-4xwj8" (OuterVolumeSpecName: "kube-api-access-4xwj8") pod "859f7f90-7f35-4c95-80fd-240e91834ff6" (UID: "859f7f90-7f35-4c95-80fd-240e91834ff6"). InnerVolumeSpecName "kube-api-access-4xwj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.151905 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859f7f90-7f35-4c95-80fd-240e91834ff6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.151943 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xwj8\" (UniqueName: \"kubernetes.io/projected/859f7f90-7f35-4c95-80fd-240e91834ff6-kube-api-access-4xwj8\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.376583 4759 generic.go:334] "Generic (PLEG): container finished" podID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerID="0c07116d4edac872b85967f4d4208c5c0b943a30727ba564b3e125536795746b" exitCode=0 Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.376608 4759 generic.go:334] "Generic (PLEG): container finished" podID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerID="a7109c4f65c82d137894f5bb0154f780f944846620a6e73eded15c6d36491d88" exitCode=0 Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.376616 4759 generic.go:334] "Generic (PLEG): container finished" podID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerID="19bd0bc5a80f37d207a3d873f7b7c86098312bc33d67181f4d441a6568ba051c" exitCode=0 Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.376674 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4700495a-dd6c-4b5d-a290-55ab3907a2f5","Type":"ContainerDied","Data":"0c07116d4edac872b85967f4d4208c5c0b943a30727ba564b3e125536795746b"} Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.376724 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4700495a-dd6c-4b5d-a290-55ab3907a2f5","Type":"ContainerDied","Data":"a7109c4f65c82d137894f5bb0154f780f944846620a6e73eded15c6d36491d88"} Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.376741 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4700495a-dd6c-4b5d-a290-55ab3907a2f5","Type":"ContainerDied","Data":"19bd0bc5a80f37d207a3d873f7b7c86098312bc33d67181f4d441a6568ba051c"} Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.380056 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"513af94f240d922705e5e2a0e8fe31099e69f7fc96ecb2cadb188c760ccd1fe0"} Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.382358 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hl5mz" event={"ID":"538f07d4-2a1a-47e9-aec3-161f7b23af6f","Type":"ContainerStarted","Data":"239cd63195d4ce714235f095a1bf4df6e04dcc4e0cef08e5faca361f74b7e8e7"} Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.388120 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"242736a8-8641-4915-b5cb-e271ce361e3a","Type":"ContainerStarted","Data":"0a1a4e5c3bc1eca49354588cfc7344eb89d87c0a7aade8c9f4ff3a461fd2d7d9"} Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.400906 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x67vg" event={"ID":"e5d07555-31e9-4b86-b4eb-56b184aef5b7","Type":"ContainerStarted","Data":"a9289c40402da4c95a2a36b05f1dd12f8b150dd7c72d31d7cc8da7608fd11174"} Dec 05 00:45:30 crc 
kubenswrapper[4759]: I1205 00:45:30.404894 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.246260764 podStartE2EDuration="5.404884331s" podCreationTimestamp="2025-12-05 00:45:25 +0000 UTC" firstStartedPulling="2025-12-05 00:45:27.048627671 +0000 UTC m=+1346.264288621" lastFinishedPulling="2025-12-05 00:45:29.207251238 +0000 UTC m=+1348.422912188" observedRunningTime="2025-12-05 00:45:30.401479987 +0000 UTC m=+1349.617140927" watchObservedRunningTime="2025-12-05 00:45:30.404884331 +0000 UTC m=+1349.620545281" Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.416214 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-njwfb" event={"ID":"859f7f90-7f35-4c95-80fd-240e91834ff6","Type":"ContainerDied","Data":"595cb69a0833ab0e747c2388f13fd961bb795ff3cc354cf491852ab7bf9e253d"} Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.416422 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="595cb69a0833ab0e747c2388f13fd961bb795ff3cc354cf491852ab7bf9e253d" Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.416505 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-njwfb" Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.939878 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-bbe7-account-create-update-tmcgm" Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.967516 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1f06ab-529e-4253-bf17-1255b1226d3f-operator-scripts\") pod \"5e1f06ab-529e-4253-bf17-1255b1226d3f\" (UID: \"5e1f06ab-529e-4253-bf17-1255b1226d3f\") " Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.967855 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glp8m\" (UniqueName: \"kubernetes.io/projected/5e1f06ab-529e-4253-bf17-1255b1226d3f-kube-api-access-glp8m\") pod \"5e1f06ab-529e-4253-bf17-1255b1226d3f\" (UID: \"5e1f06ab-529e-4253-bf17-1255b1226d3f\") " Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.968254 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1f06ab-529e-4253-bf17-1255b1226d3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e1f06ab-529e-4253-bf17-1255b1226d3f" (UID: "5e1f06ab-529e-4253-bf17-1255b1226d3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.970367 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1f06ab-529e-4253-bf17-1255b1226d3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:30 crc kubenswrapper[4759]: I1205 00:45:30.994517 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1f06ab-529e-4253-bf17-1255b1226d3f-kube-api-access-glp8m" (OuterVolumeSpecName: "kube-api-access-glp8m") pod "5e1f06ab-529e-4253-bf17-1255b1226d3f" (UID: "5e1f06ab-529e-4253-bf17-1255b1226d3f"). InnerVolumeSpecName "kube-api-access-glp8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.072599 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glp8m\" (UniqueName: \"kubernetes.io/projected/5e1f06ab-529e-4253-bf17-1255b1226d3f-kube-api-access-glp8m\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.179805 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3781-account-create-update-2sknq" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.215993 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5cmtn" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.223207 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4vclp" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.242968 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rn7b5" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.249554 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2b80-account-create-update-8nxsz" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.266821 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.275576 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035a8e83-7e26-4eb5-939c-4c70a2c86d94-operator-scripts\") pod \"035a8e83-7e26-4eb5-939c-4c70a2c86d94\" (UID: \"035a8e83-7e26-4eb5-939c-4c70a2c86d94\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.275617 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beaae381-be72-48be-8a0e-eba21df779b7-operator-scripts\") pod \"beaae381-be72-48be-8a0e-eba21df779b7\" (UID: \"beaae381-be72-48be-8a0e-eba21df779b7\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.275657 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47rbn\" (UniqueName: \"kubernetes.io/projected/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-kube-api-access-47rbn\") pod \"57dcf39f-2def-45a4-b9f7-b9138e9a1a64\" (UID: \"57dcf39f-2def-45a4-b9f7-b9138e9a1a64\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.275693 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-operator-scripts\") pod \"57dcf39f-2def-45a4-b9f7-b9138e9a1a64\" (UID: \"57dcf39f-2def-45a4-b9f7-b9138e9a1a64\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.275727 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnzxw\" (UniqueName: \"kubernetes.io/projected/beaae381-be72-48be-8a0e-eba21df779b7-kube-api-access-tnzxw\") pod \"beaae381-be72-48be-8a0e-eba21df779b7\" (UID: \"beaae381-be72-48be-8a0e-eba21df779b7\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.275772 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7zsv\" (UniqueName: 
\"kubernetes.io/projected/035a8e83-7e26-4eb5-939c-4c70a2c86d94-kube-api-access-s7zsv\") pod \"035a8e83-7e26-4eb5-939c-4c70a2c86d94\" (UID: \"035a8e83-7e26-4eb5-939c-4c70a2c86d94\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.275849 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-operator-scripts\") pod \"4a213f68-4a29-4f0d-8f47-fb47a7ed1769\" (UID: \"4a213f68-4a29-4f0d-8f47-fb47a7ed1769\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.275869 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb5xq\" (UniqueName: \"kubernetes.io/projected/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-kube-api-access-zb5xq\") pod \"4a213f68-4a29-4f0d-8f47-fb47a7ed1769\" (UID: \"4a213f68-4a29-4f0d-8f47-fb47a7ed1769\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.275892 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/179ac88c-cac6-44a8-9fa5-54bed12d118c-operator-scripts\") pod \"179ac88c-cac6-44a8-9fa5-54bed12d118c\" (UID: \"179ac88c-cac6-44a8-9fa5-54bed12d118c\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.275946 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktlmg\" (UniqueName: \"kubernetes.io/projected/179ac88c-cac6-44a8-9fa5-54bed12d118c-kube-api-access-ktlmg\") pod \"179ac88c-cac6-44a8-9fa5-54bed12d118c\" (UID: \"179ac88c-cac6-44a8-9fa5-54bed12d118c\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.276078 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035a8e83-7e26-4eb5-939c-4c70a2c86d94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "035a8e83-7e26-4eb5-939c-4c70a2c86d94" (UID: "035a8e83-7e26-4eb5-939c-4c70a2c86d94"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.276122 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beaae381-be72-48be-8a0e-eba21df779b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "beaae381-be72-48be-8a0e-eba21df779b7" (UID: "beaae381-be72-48be-8a0e-eba21df779b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.276659 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/179ac88c-cac6-44a8-9fa5-54bed12d118c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "179ac88c-cac6-44a8-9fa5-54bed12d118c" (UID: "179ac88c-cac6-44a8-9fa5-54bed12d118c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.278642 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57dcf39f-2def-45a4-b9f7-b9138e9a1a64" (UID: "57dcf39f-2def-45a4-b9f7-b9138e9a1a64"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.278657 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/035a8e83-7e26-4eb5-939c-4c70a2c86d94-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.278690 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a213f68-4a29-4f0d-8f47-fb47a7ed1769" (UID: "4a213f68-4a29-4f0d-8f47-fb47a7ed1769"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.278707 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beaae381-be72-48be-8a0e-eba21df779b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.278722 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/179ac88c-cac6-44a8-9fa5-54bed12d118c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.283650 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beaae381-be72-48be-8a0e-eba21df779b7-kube-api-access-tnzxw" (OuterVolumeSpecName: "kube-api-access-tnzxw") pod "beaae381-be72-48be-8a0e-eba21df779b7" (UID: "beaae381-be72-48be-8a0e-eba21df779b7"). InnerVolumeSpecName "kube-api-access-tnzxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.287376 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-kube-api-access-zb5xq" (OuterVolumeSpecName: "kube-api-access-zb5xq") pod "4a213f68-4a29-4f0d-8f47-fb47a7ed1769" (UID: "4a213f68-4a29-4f0d-8f47-fb47a7ed1769"). InnerVolumeSpecName "kube-api-access-zb5xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.289033 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/179ac88c-cac6-44a8-9fa5-54bed12d118c-kube-api-access-ktlmg" (OuterVolumeSpecName: "kube-api-access-ktlmg") pod "179ac88c-cac6-44a8-9fa5-54bed12d118c" (UID: "179ac88c-cac6-44a8-9fa5-54bed12d118c"). InnerVolumeSpecName "kube-api-access-ktlmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.289917 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-kube-api-access-47rbn" (OuterVolumeSpecName: "kube-api-access-47rbn") pod "57dcf39f-2def-45a4-b9f7-b9138e9a1a64" (UID: "57dcf39f-2def-45a4-b9f7-b9138e9a1a64"). InnerVolumeSpecName "kube-api-access-47rbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.290075 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035a8e83-7e26-4eb5-939c-4c70a2c86d94-kube-api-access-s7zsv" (OuterVolumeSpecName: "kube-api-access-s7zsv") pod "035a8e83-7e26-4eb5-939c-4c70a2c86d94" (UID: "035a8e83-7e26-4eb5-939c-4c70a2c86d94"). 
InnerVolumeSpecName "kube-api-access-s7zsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.304917 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6349-account-create-update-7kk9v" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.380834 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-web-config\") pod \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381206 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381232 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-operator-scripts\") pod \"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb\" (UID: \"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381258 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config-out\") pod \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381389 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-tls-assets\") pod \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381410 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4700495a-dd6c-4b5d-a290-55ab3907a2f5-prometheus-metric-storage-rulefiles-0\") pod \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381463 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b22l8\" (UniqueName: \"kubernetes.io/projected/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-kube-api-access-b22l8\") pod \"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb\" (UID: \"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381497 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-thanos-prometheus-http-client-file\") pod \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381532 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhd88\" (UniqueName: \"kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-kube-api-access-qhd88\") pod \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\" (UID: 
\"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381549 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config\") pod \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\" (UID: \"4700495a-dd6c-4b5d-a290-55ab3907a2f5\") " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381915 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381928 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb5xq\" (UniqueName: \"kubernetes.io/projected/4a213f68-4a29-4f0d-8f47-fb47a7ed1769-kube-api-access-zb5xq\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381939 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktlmg\" (UniqueName: \"kubernetes.io/projected/179ac88c-cac6-44a8-9fa5-54bed12d118c-kube-api-access-ktlmg\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381948 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47rbn\" (UniqueName: \"kubernetes.io/projected/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-kube-api-access-47rbn\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381956 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57dcf39f-2def-45a4-b9f7-b9138e9a1a64-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381966 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnzxw\" (UniqueName: \"kubernetes.io/projected/beaae381-be72-48be-8a0e-eba21df779b7-kube-api-access-tnzxw\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.381975 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7zsv\" (UniqueName: \"kubernetes.io/projected/035a8e83-7e26-4eb5-939c-4c70a2c86d94-kube-api-access-s7zsv\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.383349 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4700495a-dd6c-4b5d-a290-55ab3907a2f5-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "4700495a-dd6c-4b5d-a290-55ab3907a2f5" (UID: "4700495a-dd6c-4b5d-a290-55ab3907a2f5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.385486 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb" (UID: "4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.386211 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4700495a-dd6c-4b5d-a290-55ab3907a2f5" (UID: "4700495a-dd6c-4b5d-a290-55ab3907a2f5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.387347 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4700495a-dd6c-4b5d-a290-55ab3907a2f5" (UID: "4700495a-dd6c-4b5d-a290-55ab3907a2f5"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.387389 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config" (OuterVolumeSpecName: "config") pod "4700495a-dd6c-4b5d-a290-55ab3907a2f5" (UID: "4700495a-dd6c-4b5d-a290-55ab3907a2f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.387807 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-kube-api-access-qhd88" (OuterVolumeSpecName: "kube-api-access-qhd88") pod "4700495a-dd6c-4b5d-a290-55ab3907a2f5" (UID: "4700495a-dd6c-4b5d-a290-55ab3907a2f5"). InnerVolumeSpecName "kube-api-access-qhd88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.388842 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-kube-api-access-b22l8" (OuterVolumeSpecName: "kube-api-access-b22l8") pod "4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb" (UID: "4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb"). InnerVolumeSpecName "kube-api-access-b22l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.414470 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "4700495a-dd6c-4b5d-a290-55ab3907a2f5" (UID: "4700495a-dd6c-4b5d-a290-55ab3907a2f5"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.423394 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config-out" (OuterVolumeSpecName: "config-out") pod "4700495a-dd6c-4b5d-a290-55ab3907a2f5" (UID: "4700495a-dd6c-4b5d-a290-55ab3907a2f5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.429087 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-web-config" (OuterVolumeSpecName: "web-config") pod "4700495a-dd6c-4b5d-a290-55ab3907a2f5" (UID: "4700495a-dd6c-4b5d-a290-55ab3907a2f5"). 
InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.441270 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-bbe7-account-create-update-tmcgm" event={"ID":"5e1f06ab-529e-4253-bf17-1255b1226d3f","Type":"ContainerDied","Data":"101f8569a98bc8fa5d1dbcc4c84b0f278e6b890f67814168073fecaee9802733"} Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.441403 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="101f8569a98bc8fa5d1dbcc4c84b0f278e6b890f67814168073fecaee9802733" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.441515 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-bbe7-account-create-update-tmcgm" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.450453 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2b80-account-create-update-8nxsz" event={"ID":"57dcf39f-2def-45a4-b9f7-b9138e9a1a64","Type":"ContainerDied","Data":"9daac2211a4849adb2761af0c09fa1f57a628cd1f1306f3d0090225ece40a3f9"} Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.450558 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9daac2211a4849adb2761af0c09fa1f57a628cd1f1306f3d0090225ece40a3f9" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.450615 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2b80-account-create-update-8nxsz" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.467105 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rn7b5" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.467664 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rn7b5" event={"ID":"4a213f68-4a29-4f0d-8f47-fb47a7ed1769","Type":"ContainerDied","Data":"aba877ca7b40f82d9bdf0905e5c7b6415b47d70ef85f7600ad458127e21da3f0"} Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.467839 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aba877ca7b40f82d9bdf0905e5c7b6415b47d70ef85f7600ad458127e21da3f0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.483794 4759 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-web-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.483835 4759 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.483846 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.483854 4759 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config-out\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.483863 4759 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/4700495a-dd6c-4b5d-a290-55ab3907a2f5-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.483873 4759 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.483884 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b22l8\" (UniqueName: \"kubernetes.io/projected/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb-kube-api-access-b22l8\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.483894 4759 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.483903 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhd88\" (UniqueName: \"kubernetes.io/projected/4700495a-dd6c-4b5d-a290-55ab3907a2f5-kube-api-access-qhd88\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.483911 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4700495a-dd6c-4b5d-a290-55ab3907a2f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.500555 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"b1ea626e85006057921ba07c81209f8728c0b2c61fbb9d56daad68f677e40933"} Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.505649 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3781-account-create-update-2sknq" event={"ID":"beaae381-be72-48be-8a0e-eba21df779b7","Type":"ContainerDied","Data":"01b646635b3d0a4e19632df6890606f34b62d1e02493ab54d95570d9c86c4775"} Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.505700 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01b646635b3d0a4e19632df6890606f34b62d1e02493ab54d95570d9c86c4775" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.505757 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3781-account-create-update-2sknq" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.513921 4759 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.517222 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5cmtn" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.517563 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5cmtn" event={"ID":"179ac88c-cac6-44a8-9fa5-54bed12d118c","Type":"ContainerDied","Data":"cb0343114820c897dd522fb52e834160621b8a933d2c5e06f7e665be710bdef7"} Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.517608 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb0343114820c897dd522fb52e834160621b8a933d2c5e06f7e665be710bdef7" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.519501 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4vclp" event={"ID":"035a8e83-7e26-4eb5-939c-4c70a2c86d94","Type":"ContainerDied","Data":"7f9e5ba3f913ef21ccf6bd60df7b451a6137d03b79ba1628d0773d968a3c2d0d"} Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.519534 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f9e5ba3f913ef21ccf6bd60df7b451a6137d03b79ba1628d0773d968a3c2d0d" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.519584 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4vclp" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.529045 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4700495a-dd6c-4b5d-a290-55ab3907a2f5","Type":"ContainerDied","Data":"93fba10ffa67c0221fb64c3463234950e2c4b272b2f60bfa2adf190698baaeb4"} Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.529093 4759 scope.go:117] "RemoveContainer" containerID="0c07116d4edac872b85967f4d4208c5c0b943a30727ba564b3e125536795746b" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.529228 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.545232 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6349-account-create-update-7kk9v" event={"ID":"4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb","Type":"ContainerDied","Data":"44b5f6fd941e695a776bab3a56e3ad0c38ff79b09d7d5f75c167804bff0fe8dc"} Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.545253 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6349-account-create-update-7kk9v" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.545268 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44b5f6fd941e695a776bab3a56e3ad0c38ff79b09d7d5f75c167804bff0fe8dc" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.573475 4759 scope.go:117] "RemoveContainer" containerID="a7109c4f65c82d137894f5bb0154f780f944846620a6e73eded15c6d36491d88" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.585994 4759 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.593403 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.607341 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.615796 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616157 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616168 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616183 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a213f68-4a29-4f0d-8f47-fb47a7ed1769" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616189 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a213f68-4a29-4f0d-8f47-fb47a7ed1769" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616202 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="thanos-sidecar" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616208 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="thanos-sidecar" Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616222 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035a8e83-7e26-4eb5-939c-4c70a2c86d94" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616228 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="035a8e83-7e26-4eb5-939c-4c70a2c86d94" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616244 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1f06ab-529e-4253-bf17-1255b1226d3f" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616249 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1f06ab-529e-4253-bf17-1255b1226d3f" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616262 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="config-reloader" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616267 4759 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="config-reloader" Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616274 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="859f7f90-7f35-4c95-80fd-240e91834ff6" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616280 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="859f7f90-7f35-4c95-80fd-240e91834ff6" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616291 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="prometheus" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616296 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="prometheus" Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616328 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="init-config-reloader" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616334 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="init-config-reloader" Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616348 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beaae381-be72-48be-8a0e-eba21df779b7" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616354 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="beaae381-be72-48be-8a0e-eba21df779b7" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616364 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179ac88c-cac6-44a8-9fa5-54bed12d118c" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616369 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="179ac88c-cac6-44a8-9fa5-54bed12d118c" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: E1205 00:45:31.616384 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57dcf39f-2def-45a4-b9f7-b9138e9a1a64" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616389 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="57dcf39f-2def-45a4-b9f7-b9138e9a1a64" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616624 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="859f7f90-7f35-4c95-80fd-240e91834ff6" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616637 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="beaae381-be72-48be-8a0e-eba21df779b7" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616649 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="config-reloader" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616656 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="prometheus" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616668 4759 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="179ac88c-cac6-44a8-9fa5-54bed12d118c" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616675 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e1f06ab-529e-4253-bf17-1255b1226d3f" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616683 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="57dcf39f-2def-45a4-b9f7-b9138e9a1a64" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616694 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a213f68-4a29-4f0d-8f47-fb47a7ed1769" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616704 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb" containerName="mariadb-account-create-update" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616713 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="thanos-sidecar" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.616721 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="035a8e83-7e26-4eb5-939c-4c70a2c86d94" containerName="mariadb-database-create" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.618409 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.620402 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.622580 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.622701 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.625802 4759 scope.go:117] "RemoveContainer" containerID="19bd0bc5a80f37d207a3d873f7b7c86098312bc33d67181f4d441a6568ba051c" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.627798 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.627955 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-jvk29" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.628892 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.629078 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.678073 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.807327 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " 
pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.807408 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.807495 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.807556 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/df6b2930-dd32-49ef-a13e-2329eba38827-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.807590 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbccm\" (UniqueName: \"kubernetes.io/projected/df6b2930-dd32-49ef-a13e-2329eba38827-kube-api-access-jbccm\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.807681 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-config\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.807737 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.807762 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/df6b2930-dd32-49ef-a13e-2329eba38827-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.807802 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc 
kubenswrapper[4759]: I1205 00:45:31.807830 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.807852 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df6b2930-dd32-49ef-a13e-2329eba38827-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.860478 4759 scope.go:117] "RemoveContainer" containerID="beaf554dd9ed722fdef00beb81326a1f73be3c578c893d0dc8a1084ae8d3dcd0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.910216 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-config\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.910521 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.910634 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/df6b2930-dd32-49ef-a13e-2329eba38827-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.910755 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.910893 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.911023 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df6b2930-dd32-49ef-a13e-2329eba38827-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.911146 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.911214 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.911327 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.911466 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/df6b2930-dd32-49ef-a13e-2329eba38827-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.911535 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbccm\" (UniqueName: \"kubernetes.io/projected/df6b2930-dd32-49ef-a13e-2329eba38827-kube-api-access-jbccm\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.914724 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.924766 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/df6b2930-dd32-49ef-a13e-2329eba38827-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.941532 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/df6b2930-dd32-49ef-a13e-2329eba38827-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.943050 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 
00:45:31.948949 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.962911 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-config\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:31 crc kubenswrapper[4759]: I1205 00:45:31.975012 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:32 crc kubenswrapper[4759]: I1205 00:45:31.996665 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbccm\" (UniqueName: \"kubernetes.io/projected/df6b2930-dd32-49ef-a13e-2329eba38827-kube-api-access-jbccm\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:32 crc kubenswrapper[4759]: I1205 00:45:31.996907 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df6b2930-dd32-49ef-a13e-2329eba38827-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:32 crc kubenswrapper[4759]: I1205 00:45:31.997378 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:32 crc kubenswrapper[4759]: I1205 00:45:32.001084 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/df6b2930-dd32-49ef-a13e-2329eba38827-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:32 crc kubenswrapper[4759]: I1205 00:45:32.018161 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"df6b2930-dd32-49ef-a13e-2329eba38827\") " pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:32 crc kubenswrapper[4759]: I1205 00:45:32.136327 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:32 crc kubenswrapper[4759]: I1205 00:45:32.569380 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"6ea2489cd389f13257e4708bcc2c6aadd4937e661664db02b608393cfa05aea8"} Dec 05 00:45:32 crc kubenswrapper[4759]: I1205 00:45:32.569731 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"27a1ebabe2ee22a7ed6c2d12897c8da10ec7d32852069aad03809520763983cc"} Dec 05 00:45:32 crc kubenswrapper[4759]: I1205 00:45:32.569744 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"9b284b0c9928b3f39b035ea279f63ecd9ab21df4180756f42b67959fdc0bf0a3"} Dec 05 00:45:32 crc kubenswrapper[4759]: I1205 00:45:32.804736 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 05 00:45:33 crc kubenswrapper[4759]: I1205 00:45:33.168025 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" path="/var/lib/kubelet/pods/4700495a-dd6c-4b5d-a290-55ab3907a2f5/volumes" Dec 05 00:45:33 crc kubenswrapper[4759]: I1205 00:45:33.581496 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"df6b2930-dd32-49ef-a13e-2329eba38827","Type":"ContainerStarted","Data":"4bc71c13ccdd4862f21e0e5736b0e301470df3c8db3a7ce3e95901235928dcdc"} Dec 05 00:45:34 crc kubenswrapper[4759]: I1205 00:45:34.074627 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="4700495a-dd6c-4b5d-a290-55ab3907a2f5" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.126:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 00:45:36 crc kubenswrapper[4759]: I1205 00:45:36.621123 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"df6b2930-dd32-49ef-a13e-2329eba38827","Type":"ContainerStarted","Data":"2daadb5a51e5e83b89c1cebbc9a544751aca70a91f68f5a01052d8e8c30fbdf1"} Dec 05 00:45:37 crc kubenswrapper[4759]: I1205 00:45:37.639755 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"1b09bf0a1b314aa10802546089e525e5aee77edad223a5103f193fc2929375c4"} Dec 05 00:45:37 crc kubenswrapper[4759]: I1205 00:45:37.640039 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"4ece180c77b7109bffc916e30302080ed6b035d84e6e9cfb14019cdc3d13ad63"} Dec 05 00:45:37 crc kubenswrapper[4759]: I1205 00:45:37.643110 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x67vg" event={"ID":"e5d07555-31e9-4b86-b4eb-56b184aef5b7","Type":"ContainerStarted","Data":"666293e33e341c27b9434994b054f34fc0bbf6d90907a912b42ae4b2c2086123"} Dec 05 00:45:37 crc kubenswrapper[4759]: I1205 00:45:37.673533 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-x67vg" podStartSLOduration=3.035632335 
podStartE2EDuration="9.673515383s" podCreationTimestamp="2025-12-05 00:45:28 +0000 UTC" firstStartedPulling="2025-12-05 00:45:29.954527273 +0000 UTC m=+1349.170188223" lastFinishedPulling="2025-12-05 00:45:36.592410321 +0000 UTC m=+1355.808071271" observedRunningTime="2025-12-05 00:45:37.66725595 +0000 UTC m=+1356.882916900" watchObservedRunningTime="2025-12-05 00:45:37.673515383 +0000 UTC m=+1356.889176333" Dec 05 00:45:42 crc kubenswrapper[4759]: E1205 00:45:42.579043 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf6b2930_dd32_49ef_a13e_2329eba38827.slice/crio-conmon-2daadb5a51e5e83b89c1cebbc9a544751aca70a91f68f5a01052d8e8c30fbdf1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf6b2930_dd32_49ef_a13e_2329eba38827.slice/crio-2daadb5a51e5e83b89c1cebbc9a544751aca70a91f68f5a01052d8e8c30fbdf1.scope\": RecentStats: unable to find data in memory cache]" Dec 05 00:45:42 crc kubenswrapper[4759]: I1205 00:45:42.693411 4759 generic.go:334] "Generic (PLEG): container finished" podID="df6b2930-dd32-49ef-a13e-2329eba38827" containerID="2daadb5a51e5e83b89c1cebbc9a544751aca70a91f68f5a01052d8e8c30fbdf1" exitCode=0 Dec 05 00:45:42 crc kubenswrapper[4759]: I1205 00:45:42.693453 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"df6b2930-dd32-49ef-a13e-2329eba38827","Type":"ContainerDied","Data":"2daadb5a51e5e83b89c1cebbc9a544751aca70a91f68f5a01052d8e8c30fbdf1"} Dec 05 00:45:43 crc kubenswrapper[4759]: I1205 00:45:43.706365 4759 generic.go:334] "Generic (PLEG): container finished" podID="e5d07555-31e9-4b86-b4eb-56b184aef5b7" containerID="666293e33e341c27b9434994b054f34fc0bbf6d90907a912b42ae4b2c2086123" exitCode=0 Dec 05 00:45:43 crc kubenswrapper[4759]: I1205 00:45:43.706460 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x67vg" event={"ID":"e5d07555-31e9-4b86-b4eb-56b184aef5b7","Type":"ContainerDied","Data":"666293e33e341c27b9434994b054f34fc0bbf6d90907a912b42ae4b2c2086123"} Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.089536 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.188153 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24tkc\" (UniqueName: \"kubernetes.io/projected/e5d07555-31e9-4b86-b4eb-56b184aef5b7-kube-api-access-24tkc\") pod \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.188242 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-config-data\") pod \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.188270 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-combined-ca-bundle\") pod \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\" (UID: \"e5d07555-31e9-4b86-b4eb-56b184aef5b7\") " Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.192509 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d07555-31e9-4b86-b4eb-56b184aef5b7-kube-api-access-24tkc" (OuterVolumeSpecName: "kube-api-access-24tkc") pod "e5d07555-31e9-4b86-b4eb-56b184aef5b7" (UID: "e5d07555-31e9-4b86-b4eb-56b184aef5b7"). InnerVolumeSpecName "kube-api-access-24tkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.220702 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5d07555-31e9-4b86-b4eb-56b184aef5b7" (UID: "e5d07555-31e9-4b86-b4eb-56b184aef5b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.241562 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-config-data" (OuterVolumeSpecName: "config-data") pod "e5d07555-31e9-4b86-b4eb-56b184aef5b7" (UID: "e5d07555-31e9-4b86-b4eb-56b184aef5b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.291235 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24tkc\" (UniqueName: \"kubernetes.io/projected/e5d07555-31e9-4b86-b4eb-56b184aef5b7-kube-api-access-24tkc\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.291265 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.291275 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d07555-31e9-4b86-b4eb-56b184aef5b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.732901 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"df6b2930-dd32-49ef-a13e-2329eba38827","Type":"ContainerStarted","Data":"3ae7fa5f36a9a178a8a546eb04a4668a33820d0086386158ac00d4e4cc10ca10"} Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.738403 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x67vg" event={"ID":"e5d07555-31e9-4b86-b4eb-56b184aef5b7","Type":"ContainerDied","Data":"a9289c40402da4c95a2a36b05f1dd12f8b150dd7c72d31d7cc8da7608fd11174"} Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.738460 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x67vg" Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.738466 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9289c40402da4c95a2a36b05f1dd12f8b150dd7c72d31d7cc8da7608fd11174" Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.760447 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"3fb83dc8824d7c3ec579df96c5764efefba33dbd13edb68feacbca9b455b90e4"} Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.760487 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"7d9108233fb3f85287e4b9d03516570bb0663b07fc77c5e0122e551da040daf5"} Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.760498 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"8ef6604a69542ff33d86e48e4a8ef0c5d853ceb4b6fa36391897a3dab17e44f5"} Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.760506 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"01b95b387e3c3622241ee38b580ef315c80a698cf9d56735741b465fa26b9f76"} Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.766130 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hl5mz" event={"ID":"538f07d4-2a1a-47e9-aec3-161f7b23af6f","Type":"ContainerStarted","Data":"54384dd13cc9b46927d505d59e71f259dcb3239c9d1081840a9aba7e9c8ef0c2"} Dec 05 00:45:45 crc kubenswrapper[4759]: I1205 00:45:45.782953 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-db-sync-hl5mz" podStartSLOduration=2.897683419 podStartE2EDuration="17.782937558s" podCreationTimestamp="2025-12-05 00:45:28 +0000 UTC" firstStartedPulling="2025-12-05 00:45:29.884439593 +0000 UTC m=+1349.100100543" lastFinishedPulling="2025-12-05 00:45:44.769693732 +0000 UTC m=+1363.985354682" observedRunningTime="2025-12-05 00:45:45.777870884 +0000 UTC m=+1364.993531834" watchObservedRunningTime="2025-12-05 00:45:45.782937558 +0000 UTC m=+1364.998598508" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.001587 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-4d648"] Dec 05 00:45:46 crc kubenswrapper[4759]: E1205 00:45:46.002194 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5d07555-31e9-4b86-b4eb-56b184aef5b7" containerName="keystone-db-sync" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.002209 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d07555-31e9-4b86-b4eb-56b184aef5b7" containerName="keystone-db-sync" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.002394 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5d07555-31e9-4b86-b4eb-56b184aef5b7" containerName="keystone-db-sync" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.006478 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.019018 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr4q7\" (UniqueName: \"kubernetes.io/projected/7587133a-02a0-45a8-8753-e60338f66922-kube-api-access-zr4q7\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.019055 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.019121 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.019153 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-config\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.019178 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.071364 4759 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-4d648"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.086290 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7wdgc"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.087589 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.092756 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.096965 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.097172 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.097684 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.097692 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nl4ht" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.121380 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-config-data\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.121487 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spvn6\" (UniqueName: \"kubernetes.io/projected/a7be9242-6af1-4987-8d25-b31a4eaa7980-kube-api-access-spvn6\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.121521 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr4q7\" (UniqueName: \"kubernetes.io/projected/7587133a-02a0-45a8-8753-e60338f66922-kube-api-access-zr4q7\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.121540 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-combined-ca-bundle\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.121583 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.121603 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-scripts\") pod \"keystone-bootstrap-7wdgc\" (UID: 
\"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.121620 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-fernet-keys\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.121687 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-credential-keys\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.121770 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.121829 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-config\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.121858 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.123228 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.123529 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.123588 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7wdgc"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.123962 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.124409 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-config\") pod 
\"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.196230 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr4q7\" (UniqueName: \"kubernetes.io/projected/7587133a-02a0-45a8-8753-e60338f66922-kube-api-access-zr4q7\") pod \"dnsmasq-dns-5c9d85d47c-4d648\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.227343 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-config-data\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.227396 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spvn6\" (UniqueName: \"kubernetes.io/projected/a7be9242-6af1-4987-8d25-b31a4eaa7980-kube-api-access-spvn6\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.227421 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-combined-ca-bundle\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.227436 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-scripts\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.227451 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-fernet-keys\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.227473 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-credential-keys\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.234590 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-config-data\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.236129 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-scripts\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: 
I1205 00:45:46.241450 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-credential-keys\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.241491 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-combined-ca-bundle\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.261337 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-xbx7b"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.261922 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-fernet-keys\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.262691 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xbx7b" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.277002 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-9jwlj" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.284785 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.294571 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-xbx7b"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.301894 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spvn6\" (UniqueName: \"kubernetes.io/projected/a7be9242-6af1-4987-8d25-b31a4eaa7980-kube-api-access-spvn6\") pod \"keystone-bootstrap-7wdgc\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.322756 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.398359 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-vvtw7"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.399734 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vvtw7" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.406689 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.425534 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vvtw7"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.430805 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-k5k95" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.431887 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.450501 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-db-sync-config-data\") pod \"barbican-db-sync-vvtw7\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") " pod="openstack/barbican-db-sync-vvtw7" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.459753 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-config-data\") pod \"heat-db-sync-xbx7b\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " pod="openstack/heat-db-sync-xbx7b" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.459848 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfz4x\" (UniqueName: \"kubernetes.io/projected/419ed25e-6ca1-4ca7-978e-7b1464982278-kube-api-access-cfz4x\") pod \"barbican-db-sync-vvtw7\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") " pod="openstack/barbican-db-sync-vvtw7" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.459902 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-combined-ca-bundle\") pod \"heat-db-sync-xbx7b\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " pod="openstack/heat-db-sync-xbx7b" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.459923 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-combined-ca-bundle\") pod \"barbican-db-sync-vvtw7\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") " pod="openstack/barbican-db-sync-vvtw7" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.459953 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ttwn\" (UniqueName: \"kubernetes.io/projected/ed73e23b-4161-4968-93d0-aaabce1aa4bb-kube-api-access-6ttwn\") pod \"heat-db-sync-xbx7b\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " pod="openstack/heat-db-sync-xbx7b" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.493868 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-wdw54"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.495037 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.500719 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.513242 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t9gmd" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.513555 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.535291 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wdw54"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567345 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fe2c3db-f452-4009-abca-b9ee975ad38d-etc-machine-id\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567385 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-config-data\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567425 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-db-sync-config-data\") pod \"barbican-db-sync-vvtw7\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") " pod="openstack/barbican-db-sync-vvtw7" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567448 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-config-data\") pod \"heat-db-sync-xbx7b\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " pod="openstack/heat-db-sync-xbx7b" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567474 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-combined-ca-bundle\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567499 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfz4x\" (UniqueName: \"kubernetes.io/projected/419ed25e-6ca1-4ca7-978e-7b1464982278-kube-api-access-cfz4x\") pod \"barbican-db-sync-vvtw7\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") " pod="openstack/barbican-db-sync-vvtw7" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567525 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-combined-ca-bundle\") pod \"heat-db-sync-xbx7b\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " pod="openstack/heat-db-sync-xbx7b" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567553 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-combined-ca-bundle\") pod \"barbican-db-sync-vvtw7\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") " pod="openstack/barbican-db-sync-vvtw7" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567574 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ttwn\" (UniqueName: \"kubernetes.io/projected/ed73e23b-4161-4968-93d0-aaabce1aa4bb-kube-api-access-6ttwn\") pod \"heat-db-sync-xbx7b\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " pod="openstack/heat-db-sync-xbx7b" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567616 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-db-sync-config-data\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567633 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-scripts\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.567650 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7vnd\" (UniqueName: \"kubernetes.io/projected/8fe2c3db-f452-4009-abca-b9ee975ad38d-kube-api-access-r7vnd\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.571190 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-s5c5z"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.572513 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.576852 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.577063 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.579347 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-combined-ca-bundle\") pod \"heat-db-sync-xbx7b\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " pod="openstack/heat-db-sync-xbx7b" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.581826 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-db-sync-config-data\") pod \"barbican-db-sync-vvtw7\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") " pod="openstack/barbican-db-sync-vvtw7" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.584986 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-combined-ca-bundle\") pod \"barbican-db-sync-vvtw7\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") " pod="openstack/barbican-db-sync-vvtw7" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.591216 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dslrx" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.591877 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-4d648"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.592266 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-config-data\") pod \"heat-db-sync-xbx7b\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " pod="openstack/heat-db-sync-xbx7b" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.618871 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfz4x\" (UniqueName: \"kubernetes.io/projected/419ed25e-6ca1-4ca7-978e-7b1464982278-kube-api-access-cfz4x\") pod \"barbican-db-sync-vvtw7\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") " pod="openstack/barbican-db-sync-vvtw7" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.626009 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ttwn\" (UniqueName: \"kubernetes.io/projected/ed73e23b-4161-4968-93d0-aaabce1aa4bb-kube-api-access-6ttwn\") pod \"heat-db-sync-xbx7b\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " pod="openstack/heat-db-sync-xbx7b" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.630733 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s5c5z"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.635724 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-xbx7b" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.653493 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-bmk82"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.656226 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.672823 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tn5x\" (UniqueName: \"kubernetes.io/projected/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-kube-api-access-4tn5x\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.672898 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.672923 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-logs\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.672947 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fe2c3db-f452-4009-abca-b9ee975ad38d-etc-machine-id\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.672970 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-config-data\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.673081 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.673128 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-combined-ca-bundle\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.673187 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-config-data\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc 
kubenswrapper[4759]: I1205 00:45:46.673215 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-config\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.673275 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-scripts\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.673334 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-db-sync-config-data\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.673475 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv4q8\" (UniqueName: \"kubernetes.io/projected/0bea7702-cf77-4c94-94a4-f5201e080ffd-kube-api-access-qv4q8\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.673491 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-scripts\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.673515 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7vnd\" (UniqueName: \"kubernetes.io/projected/8fe2c3db-f452-4009-abca-b9ee975ad38d-kube-api-access-r7vnd\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.673543 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.673588 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-combined-ca-bundle\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.677834 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-scripts\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.678014 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fe2c3db-f452-4009-abca-b9ee975ad38d-etc-machine-id\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.734693 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vvtw7" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.745245 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-combined-ca-bundle\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.769429 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-db-sync-config-data\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.776519 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-config\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.776560 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-scripts\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.776594 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv4q8\" (UniqueName: \"kubernetes.io/projected/0bea7702-cf77-4c94-94a4-f5201e080ffd-kube-api-access-qv4q8\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.776667 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.776712 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-combined-ca-bundle\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.776737 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tn5x\" (UniqueName: \"kubernetes.io/projected/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-kube-api-access-4tn5x\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.776771 4759 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.776790 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-logs\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.776825 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.776891 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-config-data\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.782692 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.789953 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-config\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.793265 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-bmk82"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.796191 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-logs\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.796428 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.807628 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.808969 4759 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/neutron-db-sync-dx9np"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.815636 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dx9np" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.854729 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-299vg" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.859872 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-config-data\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.860827 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.860937 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.863383 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7vnd\" (UniqueName: \"kubernetes.io/projected/8fe2c3db-f452-4009-abca-b9ee975ad38d-kube-api-access-r7vnd\") pod \"cinder-db-sync-wdw54\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.866873 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-config-data\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.867008 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv4q8\" (UniqueName: \"kubernetes.io/projected/0bea7702-cf77-4c94-94a4-f5201e080ffd-kube-api-access-qv4q8\") pod \"dnsmasq-dns-6ffb94d8ff-bmk82\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.867397 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wdw54" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.878763 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-scripts\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.879051 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tn5x\" (UniqueName: \"kubernetes.io/projected/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-kube-api-access-4tn5x\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.879519 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-combined-ca-bundle\") pod \"placement-db-sync-s5c5z\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") " pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.898962 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dx9np"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.901849 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s5c5z" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.913001 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"28edaf49-80c4-4732-a19f-1f2348fcd8e7","Type":"ContainerStarted","Data":"32850fe851880396bc1e6de3a9b62422e36e41cb070dc1bcfc3b199f1b196fa5"} Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.918929 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.938224 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.947911 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.948151 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.958166 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.980337 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8h7m\" (UniqueName: \"kubernetes.io/projected/885cf08d-63c8-45da-ab2a-18b28a9b0f40-kube-api-access-d8h7m\") pod \"neutron-db-sync-dx9np\" (UID: \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " pod="openstack/neutron-db-sync-dx9np" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.981961 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-combined-ca-bundle\") pod \"neutron-db-sync-dx9np\" (UID: \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " pod="openstack/neutron-db-sync-dx9np" Dec 05 00:45:46 crc kubenswrapper[4759]: I1205 00:45:46.981993 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-config\") pod \"neutron-db-sync-dx9np\" (UID: \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " pod="openstack/neutron-db-sync-dx9np" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.026496 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=45.511064801 podStartE2EDuration="58.026481578s" podCreationTimestamp="2025-12-05 00:44:49 +0000 UTC" firstStartedPulling="2025-12-05 00:45:24.073685803 +0000 UTC m=+1343.289346753" lastFinishedPulling="2025-12-05 00:45:36.58910258 +0000 UTC m=+1355.804763530" observedRunningTime="2025-12-05 00:45:46.961023932 +0000 UTC m=+1366.176684882" watchObservedRunningTime="2025-12-05 00:45:47.026481578 +0000 UTC m=+1366.242142528" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.084187 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.084516 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-log-httpd\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.084567 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8h7m\" (UniqueName: \"kubernetes.io/projected/885cf08d-63c8-45da-ab2a-18b28a9b0f40-kube-api-access-d8h7m\") pod \"neutron-db-sync-dx9np\" (UID: \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " pod="openstack/neutron-db-sync-dx9np" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.084613 4759 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.084646 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c26c7\" (UniqueName: \"kubernetes.io/projected/9ba26284-ae89-4046-bc8c-acf49206704f-kube-api-access-c26c7\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.084686 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-run-httpd\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.084728 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-combined-ca-bundle\") pod \"neutron-db-sync-dx9np\" (UID: \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " pod="openstack/neutron-db-sync-dx9np" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.084753 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-config\") pod \"neutron-db-sync-dx9np\" (UID: \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " pod="openstack/neutron-db-sync-dx9np" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.084817 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-config-data\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.084865 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-scripts\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.092399 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-config\") pod \"neutron-db-sync-dx9np\" (UID: \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " pod="openstack/neutron-db-sync-dx9np" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.093294 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-combined-ca-bundle\") pod \"neutron-db-sync-dx9np\" (UID: \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " pod="openstack/neutron-db-sync-dx9np" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.100547 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8h7m\" (UniqueName: \"kubernetes.io/projected/885cf08d-63c8-45da-ab2a-18b28a9b0f40-kube-api-access-d8h7m\") pod \"neutron-db-sync-dx9np\" (UID: 
\"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " pod="openstack/neutron-db-sync-dx9np" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.110847 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.188156 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.188218 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c26c7\" (UniqueName: \"kubernetes.io/projected/9ba26284-ae89-4046-bc8c-acf49206704f-kube-api-access-c26c7\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.188251 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-run-httpd\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.188356 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-config-data\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.188397 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-scripts\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.188415 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.188451 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-log-httpd\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.188925 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-log-httpd\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.189702 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-run-httpd\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.269987 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6ffb94d8ff-bmk82"] Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.295891 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-4v2kj"] Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.297787 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.306972 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-4v2kj"] Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.315239 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.335611 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dx9np" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.348132 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-config-data\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.352966 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.354490 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.355150 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-scripts\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.363141 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c26c7\" (UniqueName: \"kubernetes.io/projected/9ba26284-ae89-4046-bc8c-acf49206704f-kube-api-access-c26c7\") pod \"ceilometer-0\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") " pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.376346 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7wdgc"] Dec 05 00:45:47 crc kubenswrapper[4759]: W1205 00:45:47.393413 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7be9242_6af1_4987_8d25_b31a4eaa7980.slice/crio-69cf463158590f218804912579a17c037443d47e65f98c9c180a5b59c6e934c0 WatchSource:0}: Error finding container 69cf463158590f218804912579a17c037443d47e65f98c9c180a5b59c6e934c0: Status 404 returned error can't find the container with id 69cf463158590f218804912579a17c037443d47e65f98c9c180a5b59c6e934c0 Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.398431 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ln7d\" (UniqueName: 
\"kubernetes.io/projected/a321a07e-cbc8-41ae-82db-0d2cbab7e333-kube-api-access-9ln7d\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.398518 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-config\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.398551 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.398568 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.398729 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.398748 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.499986 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-config\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.500338 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.500358 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.500447 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.500477 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.500524 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ln7d\" (UniqueName: \"kubernetes.io/projected/a321a07e-cbc8-41ae-82db-0d2cbab7e333-kube-api-access-9ln7d\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.501711 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-config\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.502220 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.502730 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.505776 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.506478 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.523256 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ln7d\" (UniqueName: \"kubernetes.io/projected/a321a07e-cbc8-41ae-82db-0d2cbab7e333-kube-api-access-9ln7d\") pod \"dnsmasq-dns-fcfdd6f9f-4v2kj\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.529156 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-4d648"] Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.644462 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.649722 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.937501 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s5c5z"] Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.941918 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"df6b2930-dd32-49ef-a13e-2329eba38827","Type":"ContainerStarted","Data":"983b6e9c2510747a6b416b2a1093019a6d8cfc8639681e8241ee7fdf0cefee92"} Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.943466 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7wdgc" event={"ID":"a7be9242-6af1-4987-8d25-b31a4eaa7980","Type":"ContainerStarted","Data":"cec3ad6a10e6c07bb5dc6c263fe2871b09d752d55b9b4389d1b9211358ac3d53"} Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.943495 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7wdgc" event={"ID":"a7be9242-6af1-4987-8d25-b31a4eaa7980","Type":"ContainerStarted","Data":"69cf463158590f218804912579a17c037443d47e65f98c9c180a5b59c6e934c0"} Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.948849 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" event={"ID":"7587133a-02a0-45a8-8753-e60338f66922","Type":"ContainerStarted","Data":"e0b0bf7b2e08811d818b6b1c381a947b78d053b6de8e19f844405779d2cdcbdb"} Dec 05 00:45:47 crc kubenswrapper[4759]: I1205 00:45:47.967686 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-xbx7b"] Dec 05 00:45:47 crc kubenswrapper[4759]: W1205 00:45:47.996422 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bcbbf13_82ed_4c4a_8694_d3149d730cb0.slice/crio-2c3e505da216d1dd1450ab1579769f7d8e501695d02fba9483e63aaf9717dd28 WatchSource:0}: Error finding container 2c3e505da216d1dd1450ab1579769f7d8e501695d02fba9483e63aaf9717dd28: Status 404 returned error can't find the container with id 2c3e505da216d1dd1450ab1579769f7d8e501695d02fba9483e63aaf9717dd28 Dec 05 00:45:48 crc kubenswrapper[4759]: W1205 00:45:48.008434 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fe2c3db_f452_4009_abca_b9ee975ad38d.slice/crio-dd1c85f68de91b063c96f79309eb4d90428417c1d5a61e38645affa74083c1be WatchSource:0}: Error finding container dd1c85f68de91b063c96f79309eb4d90428417c1d5a61e38645affa74083c1be: Status 404 returned error can't find the container with id dd1c85f68de91b063c96f79309eb4d90428417c1d5a61e38645affa74083c1be Dec 05 00:45:48 crc kubenswrapper[4759]: I1205 00:45:48.024084 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vvtw7"] Dec 05 00:45:48 crc kubenswrapper[4759]: I1205 00:45:48.035029 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wdw54"] Dec 05 00:45:48 crc kubenswrapper[4759]: I1205 00:45:48.045964 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7wdgc" podStartSLOduration=3.045945367 podStartE2EDuration="3.045945367s" podCreationTimestamp="2025-12-05 00:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:47.978207034 +0000 UTC m=+1367.193867984" watchObservedRunningTime="2025-12-05 00:45:48.045945367 +0000 UTC m=+1367.261606317" Dec 05 00:45:48 crc kubenswrapper[4759]: I1205 00:45:48.078079 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-bmk82"] Dec 05 00:45:48 crc kubenswrapper[4759]: I1205 00:45:48.100578 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dx9np"] Dec 05 00:45:48 crc kubenswrapper[4759]: I1205 00:45:48.362948 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-4v2kj"] Dec 05 00:45:48 crc kubenswrapper[4759]: I1205 00:45:48.452473 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:45:48 crc kubenswrapper[4759]: I1205 00:45:48.500955 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:45:48 crc kubenswrapper[4759]: I1205 00:45:48.983278 4759 generic.go:334] "Generic (PLEG): container finished" podID="0bea7702-cf77-4c94-94a4-f5201e080ffd" containerID="49668a4366b4882d6e53d415de0dbce34154a254b1fac6b5d042e0392ed33b93" exitCode=0 Dec 05 00:45:48 crc kubenswrapper[4759]: I1205 00:45:48.983358 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" event={"ID":"0bea7702-cf77-4c94-94a4-f5201e080ffd","Type":"ContainerDied","Data":"49668a4366b4882d6e53d415de0dbce34154a254b1fac6b5d042e0392ed33b93"} Dec 05 00:45:48 crc kubenswrapper[4759]: I1205 00:45:48.983386 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" event={"ID":"0bea7702-cf77-4c94-94a4-f5201e080ffd","Type":"ContainerStarted","Data":"d7c89f4d0db1f67de5b2b1a40e45bd319101f3a48f4b6af3fd35646562e82e15"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.025015 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dx9np" event={"ID":"885cf08d-63c8-45da-ab2a-18b28a9b0f40","Type":"ContainerStarted","Data":"c2374cb010257bf2b734839ed8de394e9a0ea247c389ee1a58a00c206bf27279"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.025480 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dx9np" event={"ID":"885cf08d-63c8-45da-ab2a-18b28a9b0f40","Type":"ContainerStarted","Data":"1a7ac24f3f72b0df5301e66e750c85bea98624b1a30b7c09abb16ff5785000bd"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.033906 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wdw54" event={"ID":"8fe2c3db-f452-4009-abca-b9ee975ad38d","Type":"ContainerStarted","Data":"dd1c85f68de91b063c96f79309eb4d90428417c1d5a61e38645affa74083c1be"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.051023 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" event={"ID":"a321a07e-cbc8-41ae-82db-0d2cbab7e333","Type":"ContainerStarted","Data":"da6a161f851242ffea24d247a269e9d11dae25a1465b018b5749bdecb109e157"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.051072 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" event={"ID":"a321a07e-cbc8-41ae-82db-0d2cbab7e333","Type":"ContainerStarted","Data":"3b5652f00503b8383b78a1195ff3a0a40f44ec942c6dbd1d6f223e4806465ff8"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.059343 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-s5c5z" event={"ID":"8bcbbf13-82ed-4c4a-8694-d3149d730cb0","Type":"ContainerStarted","Data":"2c3e505da216d1dd1450ab1579769f7d8e501695d02fba9483e63aaf9717dd28"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.066756 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-dx9np" podStartSLOduration=3.066736529 podStartE2EDuration="3.066736529s" podCreationTimestamp="2025-12-05 00:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:49.060963066 +0000 UTC m=+1368.276624016" watchObservedRunningTime="2025-12-05 00:45:49.066736529 +0000 UTC m=+1368.282397479" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.069817 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"df6b2930-dd32-49ef-a13e-2329eba38827","Type":"ContainerStarted","Data":"1adbf9fabf2d45ea4461f96466f0952f32f6c0f57d1aaf02cd3c8ca5b4a2c8a9"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.076633 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ba26284-ae89-4046-bc8c-acf49206704f","Type":"ContainerStarted","Data":"82cf2cef856fb66be7c38fcee33f05d0ebc7d0fea566212ac94ca43deb2fb9eb"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.078690 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vvtw7" event={"ID":"419ed25e-6ca1-4ca7-978e-7b1464982278","Type":"ContainerStarted","Data":"4a7604134582f4690bdc51ccc90dee03028b0e968411de39a0599a9ac942f6c1"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.114366 4759 generic.go:334] "Generic (PLEG): container finished" podID="7587133a-02a0-45a8-8753-e60338f66922" containerID="bd2cf37a73e08c63291da35ea7004840c40ee65b96a24da1030e68b09e664c15" exitCode=0 Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.114452 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" event={"ID":"7587133a-02a0-45a8-8753-e60338f66922","Type":"ContainerDied","Data":"bd2cf37a73e08c63291da35ea7004840c40ee65b96a24da1030e68b09e664c15"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.122580 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xbx7b" event={"ID":"ed73e23b-4161-4968-93d0-aaabce1aa4bb","Type":"ContainerStarted","Data":"1cf64d4bc0b68716976321762bf0b25a7e4dc5356d33ea56dd706a6db951abb6"} Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.141955 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.141939895 podStartE2EDuration="18.141939895s" podCreationTimestamp="2025-12-05 00:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:49.128753881 +0000 UTC m=+1368.344414831" watchObservedRunningTime="2025-12-05 00:45:49.141939895 +0000 UTC m=+1368.357600845" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.680426 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.691321 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.785066 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-dns-svc\") pod \"7587133a-02a0-45a8-8753-e60338f66922\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.786278 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-config\") pod \"7587133a-02a0-45a8-8753-e60338f66922\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.786334 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr4q7\" (UniqueName: \"kubernetes.io/projected/7587133a-02a0-45a8-8753-e60338f66922-kube-api-access-zr4q7\") pod \"7587133a-02a0-45a8-8753-e60338f66922\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.786350 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv4q8\" (UniqueName: \"kubernetes.io/projected/0bea7702-cf77-4c94-94a4-f5201e080ffd-kube-api-access-qv4q8\") pod \"0bea7702-cf77-4c94-94a4-f5201e080ffd\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.786381 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-dns-svc\") pod \"0bea7702-cf77-4c94-94a4-f5201e080ffd\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.786441 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-config\") pod \"0bea7702-cf77-4c94-94a4-f5201e080ffd\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.786508 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-nb\") pod \"7587133a-02a0-45a8-8753-e60338f66922\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.786534 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-nb\") pod \"0bea7702-cf77-4c94-94a4-f5201e080ffd\" (UID: \"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.786583 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-sb\") pod \"7587133a-02a0-45a8-8753-e60338f66922\" (UID: \"7587133a-02a0-45a8-8753-e60338f66922\") " Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.786603 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-sb\") pod \"0bea7702-cf77-4c94-94a4-f5201e080ffd\" (UID: 
\"0bea7702-cf77-4c94-94a4-f5201e080ffd\") " Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.798321 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7587133a-02a0-45a8-8753-e60338f66922-kube-api-access-zr4q7" (OuterVolumeSpecName: "kube-api-access-zr4q7") pod "7587133a-02a0-45a8-8753-e60338f66922" (UID: "7587133a-02a0-45a8-8753-e60338f66922"). InnerVolumeSpecName "kube-api-access-zr4q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.822972 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bea7702-cf77-4c94-94a4-f5201e080ffd-kube-api-access-qv4q8" (OuterVolumeSpecName: "kube-api-access-qv4q8") pod "0bea7702-cf77-4c94-94a4-f5201e080ffd" (UID: "0bea7702-cf77-4c94-94a4-f5201e080ffd"). InnerVolumeSpecName "kube-api-access-qv4q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.826232 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7587133a-02a0-45a8-8753-e60338f66922" (UID: "7587133a-02a0-45a8-8753-e60338f66922"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.835010 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7587133a-02a0-45a8-8753-e60338f66922" (UID: "7587133a-02a0-45a8-8753-e60338f66922"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.840323 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0bea7702-cf77-4c94-94a4-f5201e080ffd" (UID: "0bea7702-cf77-4c94-94a4-f5201e080ffd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.851520 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-config" (OuterVolumeSpecName: "config") pod "7587133a-02a0-45a8-8753-e60338f66922" (UID: "7587133a-02a0-45a8-8753-e60338f66922"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.857236 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7587133a-02a0-45a8-8753-e60338f66922" (UID: "7587133a-02a0-45a8-8753-e60338f66922"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.861345 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0bea7702-cf77-4c94-94a4-f5201e080ffd" (UID: "0bea7702-cf77-4c94-94a4-f5201e080ffd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.872142 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0bea7702-cf77-4c94-94a4-f5201e080ffd" (UID: "0bea7702-cf77-4c94-94a4-f5201e080ffd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.879357 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-config" (OuterVolumeSpecName: "config") pod "0bea7702-cf77-4c94-94a4-f5201e080ffd" (UID: "0bea7702-cf77-4c94-94a4-f5201e080ffd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.889498 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr4q7\" (UniqueName: \"kubernetes.io/projected/7587133a-02a0-45a8-8753-e60338f66922-kube-api-access-zr4q7\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.889531 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv4q8\" (UniqueName: \"kubernetes.io/projected/0bea7702-cf77-4c94-94a4-f5201e080ffd-kube-api-access-qv4q8\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.889541 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.889552 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.889562 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.889571 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.889581 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.889589 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bea7702-cf77-4c94-94a4-f5201e080ffd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.889598 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:45:49 crc kubenswrapper[4759]: I1205 00:45:49.889608 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7587133a-02a0-45a8-8753-e60338f66922-config\") on node \"crc\" 
DevicePath \"\"" Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.143862 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" event={"ID":"0bea7702-cf77-4c94-94a4-f5201e080ffd","Type":"ContainerDied","Data":"d7c89f4d0db1f67de5b2b1a40e45bd319101f3a48f4b6af3fd35646562e82e15"} Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.143908 4759 scope.go:117] "RemoveContainer" containerID="49668a4366b4882d6e53d415de0dbce34154a254b1fac6b5d042e0392ed33b93" Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.144023 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-bmk82" Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.152836 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" event={"ID":"7587133a-02a0-45a8-8753-e60338f66922","Type":"ContainerDied","Data":"e0b0bf7b2e08811d818b6b1c381a947b78d053b6de8e19f844405779d2cdcbdb"} Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.152905 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-4d648" Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.161827 4759 generic.go:334] "Generic (PLEG): container finished" podID="a321a07e-cbc8-41ae-82db-0d2cbab7e333" containerID="da6a161f851242ffea24d247a269e9d11dae25a1465b018b5749bdecb109e157" exitCode=0 Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.163372 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" event={"ID":"a321a07e-cbc8-41ae-82db-0d2cbab7e333","Type":"ContainerDied","Data":"da6a161f851242ffea24d247a269e9d11dae25a1465b018b5749bdecb109e157"} Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.217628 4759 scope.go:117] "RemoveContainer" containerID="bd2cf37a73e08c63291da35ea7004840c40ee65b96a24da1030e68b09e664c15" Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.377893 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-4d648"] Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.409824 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-4d648"] Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.448508 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-bmk82"] Dec 05 00:45:50 crc kubenswrapper[4759]: I1205 00:45:50.467956 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-bmk82"] Dec 05 00:45:51 crc kubenswrapper[4759]: I1205 00:45:51.172275 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bea7702-cf77-4c94-94a4-f5201e080ffd" path="/var/lib/kubelet/pods/0bea7702-cf77-4c94-94a4-f5201e080ffd/volumes" Dec 05 00:45:51 crc kubenswrapper[4759]: I1205 00:45:51.173525 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7587133a-02a0-45a8-8753-e60338f66922" path="/var/lib/kubelet/pods/7587133a-02a0-45a8-8753-e60338f66922/volumes" Dec 05 00:45:51 crc kubenswrapper[4759]: I1205 00:45:51.174130 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" event={"ID":"a321a07e-cbc8-41ae-82db-0d2cbab7e333","Type":"ContainerStarted","Data":"c19eb91359e9fd35bb8be8867dfb760ad3292f31882224455f4beb9ecec628e6"} Dec 05 00:45:52 crc kubenswrapper[4759]: I1205 00:45:52.136759 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/prometheus-metric-storage-0" Dec 05 00:45:52 crc kubenswrapper[4759]: I1205 00:45:52.192523 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:52 crc kubenswrapper[4759]: I1205 00:45:52.224706 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" podStartSLOduration=5.22468755 podStartE2EDuration="5.22468755s" podCreationTimestamp="2025-12-05 00:45:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:45:52.209079057 +0000 UTC m=+1371.424740017" watchObservedRunningTime="2025-12-05 00:45:52.22468755 +0000 UTC m=+1371.440348500" Dec 05 00:45:53 crc kubenswrapper[4759]: I1205 00:45:53.219002 4759 generic.go:334] "Generic (PLEG): container finished" podID="a7be9242-6af1-4987-8d25-b31a4eaa7980" containerID="cec3ad6a10e6c07bb5dc6c263fe2871b09d752d55b9b4389d1b9211358ac3d53" exitCode=0 Dec 05 00:45:53 crc kubenswrapper[4759]: I1205 00:45:53.219087 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7wdgc" event={"ID":"a7be9242-6af1-4987-8d25-b31a4eaa7980","Type":"ContainerDied","Data":"cec3ad6a10e6c07bb5dc6c263fe2871b09d752d55b9b4389d1b9211358ac3d53"} Dec 05 00:45:57 crc kubenswrapper[4759]: I1205 00:45:57.651645 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" Dec 05 00:45:57 crc kubenswrapper[4759]: I1205 00:45:57.747030 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hxz4t"] Dec 05 00:45:57 crc kubenswrapper[4759]: I1205 00:45:57.752251 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" podUID="e332e191-2628-48d0-be38-886992343bc8" containerName="dnsmasq-dns" containerID="cri-o://ef87ecc5174814c63cc0de21f385d1e30f8344eaebfd0ba1ca98d409f0497e20" gracePeriod=10 Dec 05 00:45:59 crc kubenswrapper[4759]: I1205 00:45:59.294445 4759 generic.go:334] "Generic (PLEG): container finished" podID="e332e191-2628-48d0-be38-886992343bc8" containerID="ef87ecc5174814c63cc0de21f385d1e30f8344eaebfd0ba1ca98d409f0497e20" exitCode=0 Dec 05 00:45:59 crc kubenswrapper[4759]: I1205 00:45:59.294877 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" event={"ID":"e332e191-2628-48d0-be38-886992343bc8","Type":"ContainerDied","Data":"ef87ecc5174814c63cc0de21f385d1e30f8344eaebfd0ba1ca98d409f0497e20"} Dec 05 00:46:00 crc kubenswrapper[4759]: I1205 00:46:00.061547 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" podUID="e332e191-2628-48d0-be38-886992343bc8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: connect: connection refused" Dec 05 00:46:02 crc kubenswrapper[4759]: I1205 00:46:02.136862 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 05 00:46:02 crc kubenswrapper[4759]: I1205 00:46:02.143192 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 05 00:46:02 crc kubenswrapper[4759]: I1205 00:46:02.335981 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 05 00:46:03 crc kubenswrapper[4759]: I1205 00:46:03.342101 
4759 generic.go:334] "Generic (PLEG): container finished" podID="538f07d4-2a1a-47e9-aec3-161f7b23af6f" containerID="54384dd13cc9b46927d505d59e71f259dcb3239c9d1081840a9aba7e9c8ef0c2" exitCode=0 Dec 05 00:46:03 crc kubenswrapper[4759]: I1205 00:46:03.342170 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hl5mz" event={"ID":"538f07d4-2a1a-47e9-aec3-161f7b23af6f","Type":"ContainerDied","Data":"54384dd13cc9b46927d505d59e71f259dcb3239c9d1081840a9aba7e9c8ef0c2"} Dec 05 00:46:05 crc kubenswrapper[4759]: I1205 00:46:05.061542 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" podUID="e332e191-2628-48d0-be38-886992343bc8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: connect: connection refused" Dec 05 00:46:07 crc kubenswrapper[4759]: I1205 00:46:07.380970 4759 generic.go:334] "Generic (PLEG): container finished" podID="885cf08d-63c8-45da-ab2a-18b28a9b0f40" containerID="c2374cb010257bf2b734839ed8de394e9a0ea247c389ee1a58a00c206bf27279" exitCode=0 Dec 05 00:46:07 crc kubenswrapper[4759]: I1205 00:46:07.381059 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dx9np" event={"ID":"885cf08d-63c8-45da-ab2a-18b28a9b0f40","Type":"ContainerDied","Data":"c2374cb010257bf2b734839ed8de394e9a0ea247c389ee1a58a00c206bf27279"} Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.065815 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" podUID="e332e191-2628-48d0-be38-886992343bc8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: i/o timeout" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.066533 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.117854 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.120919 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hl5mz" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.225844 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-scripts\") pod \"a7be9242-6af1-4987-8d25-b31a4eaa7980\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.225949 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-fernet-keys\") pod \"a7be9242-6af1-4987-8d25-b31a4eaa7980\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.225969 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-config-data\") pod \"a7be9242-6af1-4987-8d25-b31a4eaa7980\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.226003 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-combined-ca-bundle\") pod \"a7be9242-6af1-4987-8d25-b31a4eaa7980\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.226574 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spvn6\" (UniqueName: \"kubernetes.io/projected/a7be9242-6af1-4987-8d25-b31a4eaa7980-kube-api-access-spvn6\") pod \"a7be9242-6af1-4987-8d25-b31a4eaa7980\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.226620 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-combined-ca-bundle\") pod \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.226670 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rxbh\" (UniqueName: \"kubernetes.io/projected/538f07d4-2a1a-47e9-aec3-161f7b23af6f-kube-api-access-5rxbh\") pod \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.226705 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-db-sync-config-data\") pod \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\" (UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.226732 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-credential-keys\") pod \"a7be9242-6af1-4987-8d25-b31a4eaa7980\" (UID: \"a7be9242-6af1-4987-8d25-b31a4eaa7980\") " Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.226795 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-config-data\") pod \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\" 
(UID: \"538f07d4-2a1a-47e9-aec3-161f7b23af6f\") " Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.233425 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a7be9242-6af1-4987-8d25-b31a4eaa7980" (UID: "a7be9242-6af1-4987-8d25-b31a4eaa7980"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.233426 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "538f07d4-2a1a-47e9-aec3-161f7b23af6f" (UID: "538f07d4-2a1a-47e9-aec3-161f7b23af6f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.233964 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7be9242-6af1-4987-8d25-b31a4eaa7980-kube-api-access-spvn6" (OuterVolumeSpecName: "kube-api-access-spvn6") pod "a7be9242-6af1-4987-8d25-b31a4eaa7980" (UID: "a7be9242-6af1-4987-8d25-b31a4eaa7980"). InnerVolumeSpecName "kube-api-access-spvn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.234075 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/538f07d4-2a1a-47e9-aec3-161f7b23af6f-kube-api-access-5rxbh" (OuterVolumeSpecName: "kube-api-access-5rxbh") pod "538f07d4-2a1a-47e9-aec3-161f7b23af6f" (UID: "538f07d4-2a1a-47e9-aec3-161f7b23af6f"). InnerVolumeSpecName "kube-api-access-5rxbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.234346 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-scripts" (OuterVolumeSpecName: "scripts") pod "a7be9242-6af1-4987-8d25-b31a4eaa7980" (UID: "a7be9242-6af1-4987-8d25-b31a4eaa7980"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.235486 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a7be9242-6af1-4987-8d25-b31a4eaa7980" (UID: "a7be9242-6af1-4987-8d25-b31a4eaa7980"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.255129 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-config-data" (OuterVolumeSpecName: "config-data") pod "a7be9242-6af1-4987-8d25-b31a4eaa7980" (UID: "a7be9242-6af1-4987-8d25-b31a4eaa7980"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.258635 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "538f07d4-2a1a-47e9-aec3-161f7b23af6f" (UID: "538f07d4-2a1a-47e9-aec3-161f7b23af6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.269451 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7be9242-6af1-4987-8d25-b31a4eaa7980" (UID: "a7be9242-6af1-4987-8d25-b31a4eaa7980"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.279613 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-config-data" (OuterVolumeSpecName: "config-data") pod "538f07d4-2a1a-47e9-aec3-161f7b23af6f" (UID: "538f07d4-2a1a-47e9-aec3-161f7b23af6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.331109 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.331180 4759 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.331195 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.331210 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.331225 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spvn6\" (UniqueName: \"kubernetes.io/projected/a7be9242-6af1-4987-8d25-b31a4eaa7980-kube-api-access-spvn6\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.331238 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.331250 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rxbh\" (UniqueName: \"kubernetes.io/projected/538f07d4-2a1a-47e9-aec3-161f7b23af6f-kube-api-access-5rxbh\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.331263 4759 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.331274 4759 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7be9242-6af1-4987-8d25-b31a4eaa7980-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.331286 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/538f07d4-2a1a-47e9-aec3-161f7b23af6f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.488375 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hl5mz" event={"ID":"538f07d4-2a1a-47e9-aec3-161f7b23af6f","Type":"ContainerDied","Data":"239cd63195d4ce714235f095a1bf4df6e04dcc4e0cef08e5faca361f74b7e8e7"} Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.488415 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="239cd63195d4ce714235f095a1bf4df6e04dcc4e0cef08e5faca361f74b7e8e7" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.488419 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hl5mz" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.490464 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7wdgc" event={"ID":"a7be9242-6af1-4987-8d25-b31a4eaa7980","Type":"ContainerDied","Data":"69cf463158590f218804912579a17c037443d47e65f98c9c180a5b59c6e934c0"} Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.490503 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69cf463158590f218804912579a17c037443d47e65f98c9c180a5b59c6e934c0" Dec 05 00:46:15 crc kubenswrapper[4759]: I1205 00:46:15.490566 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7wdgc" Dec 05 00:46:15 crc kubenswrapper[4759]: E1205 00:46:15.560522 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Dec 05 00:46:15 crc kubenswrapper[4759]: E1205 00:46:15.560697 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ttwn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-xbx7b_openstack(ed73e23b-4161-4968-93d0-aaabce1aa4bb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:46:15 crc kubenswrapper[4759]: E1205 00:46:15.562873 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-xbx7b" podUID="ed73e23b-4161-4968-93d0-aaabce1aa4bb" Dec 05 00:46:16 crc kubenswrapper[4759]: E1205 00:46:16.180285 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 05 00:46:16 crc kubenswrapper[4759]: E1205 00:46:16.180510 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n557h697h5f9h5b5hfh669h7dh57bh9bh65fh86hbbh574h64fhc9h54h676h687h64dh5dbh6ch58dhb7h659h556h6fh5fch586h579h5h5dbh5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c26c7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(9ba26284-ae89-4046-bc8c-acf49206704f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.239587 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7wdgc"] Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.259611 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7wdgc"] Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.293005 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.306475 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dx9np" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320053 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hkbq2"] Dec 05 00:46:16 crc kubenswrapper[4759]: E1205 00:46:16.320450 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="538f07d4-2a1a-47e9-aec3-161f7b23af6f" containerName="glance-db-sync" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320466 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="538f07d4-2a1a-47e9-aec3-161f7b23af6f" containerName="glance-db-sync" Dec 05 00:46:16 crc kubenswrapper[4759]: E1205 00:46:16.320481 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bea7702-cf77-4c94-94a4-f5201e080ffd" containerName="init" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320488 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bea7702-cf77-4c94-94a4-f5201e080ffd" containerName="init" Dec 05 00:46:16 crc kubenswrapper[4759]: E1205 00:46:16.320499 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e332e191-2628-48d0-be38-886992343bc8" containerName="init" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320505 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="e332e191-2628-48d0-be38-886992343bc8" containerName="init" Dec 05 00:46:16 crc kubenswrapper[4759]: E1205 00:46:16.320517 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7be9242-6af1-4987-8d25-b31a4eaa7980" containerName="keystone-bootstrap" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320523 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7be9242-6af1-4987-8d25-b31a4eaa7980" containerName="keystone-bootstrap" Dec 05 00:46:16 crc kubenswrapper[4759]: E1205 00:46:16.320537 4759 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e332e191-2628-48d0-be38-886992343bc8" containerName="dnsmasq-dns" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320542 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="e332e191-2628-48d0-be38-886992343bc8" containerName="dnsmasq-dns" Dec 05 00:46:16 crc kubenswrapper[4759]: E1205 00:46:16.320561 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="885cf08d-63c8-45da-ab2a-18b28a9b0f40" containerName="neutron-db-sync" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320567 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="885cf08d-63c8-45da-ab2a-18b28a9b0f40" containerName="neutron-db-sync" Dec 05 00:46:16 crc kubenswrapper[4759]: E1205 00:46:16.320579 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7587133a-02a0-45a8-8753-e60338f66922" containerName="init" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320585 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7587133a-02a0-45a8-8753-e60338f66922" containerName="init" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320742 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bea7702-cf77-4c94-94a4-f5201e080ffd" containerName="init" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320753 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="7587133a-02a0-45a8-8753-e60338f66922" containerName="init" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320767 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="e332e191-2628-48d0-be38-886992343bc8" containerName="dnsmasq-dns" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320777 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="885cf08d-63c8-45da-ab2a-18b28a9b0f40" containerName="neutron-db-sync" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320790 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7be9242-6af1-4987-8d25-b31a4eaa7980" containerName="keystone-bootstrap" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.320804 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="538f07d4-2a1a-47e9-aec3-161f7b23af6f" containerName="glance-db-sync" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.321452 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.330283 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.330510 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.330644 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.330834 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.331107 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nl4ht" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.386958 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hkbq2"] Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.455540 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-dns-svc\") pod \"e332e191-2628-48d0-be38-886992343bc8\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.455645 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-config\") pod \"e332e191-2628-48d0-be38-886992343bc8\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.455727 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4st5k\" (UniqueName: \"kubernetes.io/projected/e332e191-2628-48d0-be38-886992343bc8-kube-api-access-4st5k\") pod \"e332e191-2628-48d0-be38-886992343bc8\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.455765 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-sb\") pod \"e332e191-2628-48d0-be38-886992343bc8\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.455864 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8h7m\" (UniqueName: \"kubernetes.io/projected/885cf08d-63c8-45da-ab2a-18b28a9b0f40-kube-api-access-d8h7m\") pod \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\" (UID: \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.455888 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-config\") pod \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\" (UID: \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.455940 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-nb\") pod \"e332e191-2628-48d0-be38-886992343bc8\" (UID: \"e332e191-2628-48d0-be38-886992343bc8\") " Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 
00:46:16.455972 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-combined-ca-bundle\") pod \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\" (UID: \"885cf08d-63c8-45da-ab2a-18b28a9b0f40\") " Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.458142 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-scripts\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.458196 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-combined-ca-bundle\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.458245 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-config-data\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.458332 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mwfl\" (UniqueName: \"kubernetes.io/projected/9e5f8952-6ad7-472f-b295-b426f1404270-kube-api-access-7mwfl\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.458367 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-credential-keys\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.458495 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-fernet-keys\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.487517 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e332e191-2628-48d0-be38-886992343bc8-kube-api-access-4st5k" (OuterVolumeSpecName: "kube-api-access-4st5k") pod "e332e191-2628-48d0-be38-886992343bc8" (UID: "e332e191-2628-48d0-be38-886992343bc8"). InnerVolumeSpecName "kube-api-access-4st5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.490541 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/885cf08d-63c8-45da-ab2a-18b28a9b0f40-kube-api-access-d8h7m" (OuterVolumeSpecName: "kube-api-access-d8h7m") pod "885cf08d-63c8-45da-ab2a-18b28a9b0f40" (UID: "885cf08d-63c8-45da-ab2a-18b28a9b0f40"). 
InnerVolumeSpecName "kube-api-access-d8h7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.509919 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dx9np" event={"ID":"885cf08d-63c8-45da-ab2a-18b28a9b0f40","Type":"ContainerDied","Data":"1a7ac24f3f72b0df5301e66e750c85bea98624b1a30b7c09abb16ff5785000bd"} Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.509960 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a7ac24f3f72b0df5301e66e750c85bea98624b1a30b7c09abb16ff5785000bd" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.510017 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dx9np" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.512596 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.512753 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" event={"ID":"e332e191-2628-48d0-be38-886992343bc8","Type":"ContainerDied","Data":"d866018c16a540a0fcc280210c22431c2d2f4b0b3a2e221a52cba049254479d5"} Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.512783 4759 scope.go:117] "RemoveContainer" containerID="ef87ecc5174814c63cc0de21f385d1e30f8344eaebfd0ba1ca98d409f0497e20" Dec 05 00:46:16 crc kubenswrapper[4759]: E1205 00:46:16.513830 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-xbx7b" podUID="ed73e23b-4161-4968-93d0-aaabce1aa4bb" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.561043 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mwfl\" (UniqueName: \"kubernetes.io/projected/9e5f8952-6ad7-472f-b295-b426f1404270-kube-api-access-7mwfl\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.561084 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-credential-keys\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.561155 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-fernet-keys\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.561230 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-scripts\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.561254 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-combined-ca-bundle\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.561275 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-config-data\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.561343 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4st5k\" (UniqueName: \"kubernetes.io/projected/e332e191-2628-48d0-be38-886992343bc8-kube-api-access-4st5k\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.561356 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8h7m\" (UniqueName: \"kubernetes.io/projected/885cf08d-63c8-45da-ab2a-18b28a9b0f40-kube-api-access-d8h7m\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.597186 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-fernet-keys\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.626874 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-combined-ca-bundle\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.629007 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-scripts\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.631421 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-config-data\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.635835 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-config" (OuterVolumeSpecName: "config") pod "885cf08d-63c8-45da-ab2a-18b28a9b0f40" (UID: "885cf08d-63c8-45da-ab2a-18b28a9b0f40"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.638072 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-credential-keys\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.652050 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mwfl\" (UniqueName: \"kubernetes.io/projected/9e5f8952-6ad7-472f-b295-b426f1404270-kube-api-access-7mwfl\") pod \"keystone-bootstrap-hkbq2\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") " pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.664068 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.693034 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-config" (OuterVolumeSpecName: "config") pod "e332e191-2628-48d0-be38-886992343bc8" (UID: "e332e191-2628-48d0-be38-886992343bc8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.695484 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "885cf08d-63c8-45da-ab2a-18b28a9b0f40" (UID: "885cf08d-63c8-45da-ab2a-18b28a9b0f40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.709842 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e332e191-2628-48d0-be38-886992343bc8" (UID: "e332e191-2628-48d0-be38-886992343bc8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.722108 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-98jrp"] Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.723737 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.732078 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-98jrp"] Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.776072 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-config\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.776213 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsg5h\" (UniqueName: \"kubernetes.io/projected/4a0fae4c-619d-4c23-903e-127b57fc5211-kube-api-access-gsg5h\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.776342 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.776419 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.776467 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.776522 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.776702 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.776714 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.776726 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/885cf08d-63c8-45da-ab2a-18b28a9b0f40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.826491 4759 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e332e191-2628-48d0-be38-886992343bc8" (UID: "e332e191-2628-48d0-be38-886992343bc8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.826535 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e332e191-2628-48d0-be38-886992343bc8" (UID: "e332e191-2628-48d0-be38-886992343bc8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.878550 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.878608 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.878644 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.878693 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-config\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.879371 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.879664 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.879833 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.879914 4759 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-config\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.878741 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsg5h\" (UniqueName: \"kubernetes.io/projected/4a0fae4c-619d-4c23-903e-127b57fc5211-kube-api-access-gsg5h\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.880852 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.880932 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.880954 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e332e191-2628-48d0-be38-886992343bc8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.881519 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.898910 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsg5h\" (UniqueName: \"kubernetes.io/projected/4a0fae4c-619d-4c23-903e-127b57fc5211-kube-api-access-gsg5h\") pod \"dnsmasq-dns-57c957c4ff-98jrp\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") " pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:16 crc kubenswrapper[4759]: I1205 00:46:16.940681 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hkbq2" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.074964 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.141171 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hxz4t"] Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.148777 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-hxz4t"] Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.170731 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7be9242-6af1-4987-8d25-b31a4eaa7980" path="/var/lib/kubelet/pods/a7be9242-6af1-4987-8d25-b31a4eaa7980/volumes" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.171322 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e332e191-2628-48d0-be38-886992343bc8" path="/var/lib/kubelet/pods/e332e191-2628-48d0-be38-886992343bc8/volumes" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.499753 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-98jrp"] Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.541290 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9wz2q"] Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.562192 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9wz2q"] Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.562324 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.593918 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bd7d6d9bd-hpq49"] Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.595525 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.604505 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.604914 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-299vg" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.604512 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.610198 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.619151 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bd7d6d9bd-hpq49"] Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.697278 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txthv\" (UniqueName: \"kubernetes.io/projected/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-kube-api-access-txthv\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.697330 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.697348 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-config\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.697425 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-httpd-config\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.697487 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.697507 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-config\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.697526 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.697561 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-combined-ca-bundle\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.697592 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-255lk\" (UniqueName: \"kubernetes.io/projected/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-kube-api-access-255lk\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.697613 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-ovndb-tls-certs\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.697634 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.799272 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txthv\" (UniqueName: \"kubernetes.io/projected/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-kube-api-access-txthv\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.799348 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.799375 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-config\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.799460 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-httpd-config\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.799481 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.799501 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-config\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.799521 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.799561 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-combined-ca-bundle\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.799599 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-255lk\" (UniqueName: \"kubernetes.io/projected/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-kube-api-access-255lk\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.799622 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-ovndb-tls-certs\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.799647 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.800167 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.800519 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-config\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.800802 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: 
\"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.800994 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.804675 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.816863 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-combined-ca-bundle\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.817620 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-255lk\" (UniqueName: \"kubernetes.io/projected/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-kube-api-access-255lk\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.820737 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-config\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.821216 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-httpd-config\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.821552 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txthv\" (UniqueName: \"kubernetes.io/projected/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-kube-api-access-txthv\") pod \"dnsmasq-dns-5ccc5c4795-9wz2q\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.830102 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-ovndb-tls-certs\") pod \"neutron-6bd7d6d9bd-hpq49\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.893374 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:17 crc kubenswrapper[4759]: I1205 00:46:17.923038 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:18 crc kubenswrapper[4759]: I1205 00:46:18.481334 4759 scope.go:117] "RemoveContainer" containerID="8c3bda245e3836d5b2462b42288f424a4f366c08d81f617136fd39a29456ee34" Dec 05 00:46:18 crc kubenswrapper[4759]: E1205 00:46:18.494131 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 05 00:46:18 crc kubenswrapper[4759]: E1205 00:46:18.494375 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r7vnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-wdw54_openstack(8fe2c3db-f452-4009-abca-b9ee975ad38d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:46:18 crc kubenswrapper[4759]: E1205 00:46:18.496360 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-wdw54" 
podUID="8fe2c3db-f452-4009-abca-b9ee975ad38d" Dec 05 00:46:18 crc kubenswrapper[4759]: E1205 00:46:18.574374 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-wdw54" podUID="8fe2c3db-f452-4009-abca-b9ee975ad38d" Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.180557 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9wz2q"] Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.234219 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-98jrp"] Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.246880 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hkbq2"] Dec 05 00:46:19 crc kubenswrapper[4759]: W1205 00:46:19.247266 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a0fae4c_619d_4c23_903e_127b57fc5211.slice/crio-1c51f9c29b434fb0580731cf9f250bcf1341babc700a92f0d611aa4c7dc701c5 WatchSource:0}: Error finding container 1c51f9c29b434fb0580731cf9f250bcf1341babc700a92f0d611aa4c7dc701c5: Status 404 returned error can't find the container with id 1c51f9c29b434fb0580731cf9f250bcf1341babc700a92f0d611aa4c7dc701c5 Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.272491 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bd7d6d9bd-hpq49"] Dec 05 00:46:19 crc kubenswrapper[4759]: W1205 00:46:19.282627 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75ccfa87_d7ed_4bdd_a93d_a8fdd24f0372.slice/crio-abf84cec919fc214c54244eafdeb985c7811374d42bfdb566351ea889e690848 WatchSource:0}: Error finding container abf84cec919fc214c54244eafdeb985c7811374d42bfdb566351ea889e690848: Status 404 returned error can't find the container with id abf84cec919fc214c54244eafdeb985c7811374d42bfdb566351ea889e690848 Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.587346 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hkbq2" event={"ID":"9e5f8952-6ad7-472f-b295-b426f1404270","Type":"ContainerStarted","Data":"dc4d9b28b4fb076966c0c353c9b60133f46742a73a6f8997bc979ccb3230050e"} Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.587630 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hkbq2" event={"ID":"9e5f8952-6ad7-472f-b295-b426f1404270","Type":"ContainerStarted","Data":"48cf60be5250dc279460f0246d870ac581c97f57478acc003de49100ea99c6fc"} Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.600965 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" podUID="4a0fae4c-619d-4c23-903e-127b57fc5211" containerName="init" containerID="cri-o://a0cef4beecf01ba853be9ca99d1c1da65c718f3048d1cf4614d3124b90840783" gracePeriod=10 Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.600997 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" event={"ID":"4a0fae4c-619d-4c23-903e-127b57fc5211","Type":"ContainerStarted","Data":"a0cef4beecf01ba853be9ca99d1c1da65c718f3048d1cf4614d3124b90840783"} Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.601052 4759 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" event={"ID":"4a0fae4c-619d-4c23-903e-127b57fc5211","Type":"ContainerStarted","Data":"1c51f9c29b434fb0580731cf9f250bcf1341babc700a92f0d611aa4c7dc701c5"}
Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.611314 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s5c5z" event={"ID":"8bcbbf13-82ed-4c4a-8694-d3149d730cb0","Type":"ContainerStarted","Data":"f85a35ddba335d95eaf0401b53d370c3576a48693a5629c7b064127025b99058"}
Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.617697 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hkbq2" podStartSLOduration=3.617672876 podStartE2EDuration="3.617672876s" podCreationTimestamp="2025-12-05 00:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:19.603775956 +0000 UTC m=+1398.819436906" watchObservedRunningTime="2025-12-05 00:46:19.617672876 +0000 UTC m=+1398.833333826"
Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.622430 4759 generic.go:334] "Generic (PLEG): container finished" podID="283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" containerID="5f43acf4f73432f27279edfab1df1e9454cdf47e187484e46d03991c661436aa" exitCode=0
Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.622524 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" event={"ID":"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792","Type":"ContainerDied","Data":"5f43acf4f73432f27279edfab1df1e9454cdf47e187484e46d03991c661436aa"}
Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.622555 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" event={"ID":"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792","Type":"ContainerStarted","Data":"aa83f15772ea2819891ac11e771fc71e852bd675ad6b6119202766f4c78f3eb9"}
Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.624753 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd7d6d9bd-hpq49" event={"ID":"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372","Type":"ContainerStarted","Data":"cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3"}
Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.624786 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd7d6d9bd-hpq49" event={"ID":"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372","Type":"ContainerStarted","Data":"abf84cec919fc214c54244eafdeb985c7811374d42bfdb566351ea889e690848"}
Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.626193 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vvtw7" event={"ID":"419ed25e-6ca1-4ca7-978e-7b1464982278","Type":"ContainerStarted","Data":"9c4a66b1bac8326805e90951a7af9a0adb1cf7a0045b18833bd1e9f4dafc6d82"}
Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.693397 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-vvtw7" podStartSLOduration=3.223091938 podStartE2EDuration="33.693379066s" podCreationTimestamp="2025-12-05 00:45:46 +0000 UTC" firstStartedPulling="2025-12-05 00:45:47.995866598 +0000 UTC m=+1367.211527548" lastFinishedPulling="2025-12-05 00:46:18.466153726 +0000 UTC m=+1397.681814676" observedRunningTime="2025-12-05 00:46:19.690985087 +0000 UTC m=+1398.906646037" watchObservedRunningTime="2025-12-05 00:46:19.693379066 +0000 UTC m=+1398.909040016"
Dec 05 00:46:19 crc kubenswrapper[4759]: I1205 00:46:19.710592 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-s5c5z" podStartSLOduration=3.254728525 podStartE2EDuration="33.710573708s" podCreationTimestamp="2025-12-05 00:45:46 +0000 UTC" firstStartedPulling="2025-12-05 00:45:48.006811337 +0000 UTC m=+1367.222472277" lastFinishedPulling="2025-12-05 00:46:18.46265651 +0000 UTC m=+1397.678317460" observedRunningTime="2025-12-05 00:46:19.707527933 +0000 UTC m=+1398.923188883" watchObservedRunningTime="2025-12-05 00:46:19.710573708 +0000 UTC m=+1398.926234658"
Dec 05 00:46:20 crc kubenswrapper[4759]: I1205 00:46:20.066501 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-hxz4t" podUID="e332e191-2628-48d0-be38-886992343bc8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: i/o timeout"
Dec 05 00:46:20 crc kubenswrapper[4759]: I1205 00:46:20.636057 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd7d6d9bd-hpq49" event={"ID":"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372","Type":"ContainerStarted","Data":"83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30"}
Dec 05 00:46:20 crc kubenswrapper[4759]: I1205 00:46:20.636467 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bd7d6d9bd-hpq49"
Dec 05 00:46:20 crc kubenswrapper[4759]: I1205 00:46:20.637782 4759 generic.go:334] "Generic (PLEG): container finished" podID="4a0fae4c-619d-4c23-903e-127b57fc5211" containerID="a0cef4beecf01ba853be9ca99d1c1da65c718f3048d1cf4614d3124b90840783" exitCode=0
Dec 05 00:46:20 crc kubenswrapper[4759]: I1205 00:46:20.637824 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" event={"ID":"4a0fae4c-619d-4c23-903e-127b57fc5211","Type":"ContainerDied","Data":"a0cef4beecf01ba853be9ca99d1c1da65c718f3048d1cf4614d3124b90840783"}
Dec 05 00:46:20 crc kubenswrapper[4759]: I1205 00:46:20.641073 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" event={"ID":"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792","Type":"ContainerStarted","Data":"444351a4492c3a62f57421dbcaccd69251cb813a90111f81776d50d40d0577e7"}
Dec 05 00:46:20 crc kubenswrapper[4759]: I1205 00:46:20.641099 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q"
Dec 05 00:46:20 crc kubenswrapper[4759]: I1205 00:46:20.658245 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bd7d6d9bd-hpq49" podStartSLOduration=3.658225494 podStartE2EDuration="3.658225494s" podCreationTimestamp="2025-12-05 00:46:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:20.655699472 +0000 UTC m=+1399.871360442" watchObservedRunningTime="2025-12-05 00:46:20.658225494 +0000 UTC m=+1399.873886444"
Dec 05 00:46:20 crc kubenswrapper[4759]: I1205 00:46:20.688541 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" podStartSLOduration=3.688520987 podStartE2EDuration="3.688520987s" podCreationTimestamp="2025-12-05 00:46:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:20.6825186 +0000 UTC m=+1399.898179550" watchObservedRunningTime="2025-12-05 00:46:20.688520987 +0000 UTC m=+1399.904181937"
Dec 05 00:46:20 crc kubenswrapper[4759]: I1205 00:46:20.992165 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-98jrp"
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.079184 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-config\") pod \"4a0fae4c-619d-4c23-903e-127b57fc5211\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") "
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.079231 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-sb\") pod \"4a0fae4c-619d-4c23-903e-127b57fc5211\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") "
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.079291 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-svc\") pod \"4a0fae4c-619d-4c23-903e-127b57fc5211\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") "
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.079327 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsg5h\" (UniqueName: \"kubernetes.io/projected/4a0fae4c-619d-4c23-903e-127b57fc5211-kube-api-access-gsg5h\") pod \"4a0fae4c-619d-4c23-903e-127b57fc5211\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") "
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.079404 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-swift-storage-0\") pod \"4a0fae4c-619d-4c23-903e-127b57fc5211\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") "
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.079482 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-nb\") pod \"4a0fae4c-619d-4c23-903e-127b57fc5211\" (UID: \"4a0fae4c-619d-4c23-903e-127b57fc5211\") "
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.109671 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a0fae4c-619d-4c23-903e-127b57fc5211-kube-api-access-gsg5h" (OuterVolumeSpecName: "kube-api-access-gsg5h") pod "4a0fae4c-619d-4c23-903e-127b57fc5211" (UID: "4a0fae4c-619d-4c23-903e-127b57fc5211"). InnerVolumeSpecName "kube-api-access-gsg5h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.115182 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a0fae4c-619d-4c23-903e-127b57fc5211" (UID: "4a0fae4c-619d-4c23-903e-127b57fc5211"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.128750 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a0fae4c-619d-4c23-903e-127b57fc5211" (UID: "4a0fae4c-619d-4c23-903e-127b57fc5211"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.135926 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4a0fae4c-619d-4c23-903e-127b57fc5211" (UID: "4a0fae4c-619d-4c23-903e-127b57fc5211"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.137795 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a0fae4c-619d-4c23-903e-127b57fc5211" (UID: "4a0fae4c-619d-4c23-903e-127b57fc5211"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.138709 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-config" (OuterVolumeSpecName: "config") pod "4a0fae4c-619d-4c23-903e-127b57fc5211" (UID: "4a0fae4c-619d-4c23-903e-127b57fc5211"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.182199 4759 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.182365 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.182462 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-config\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.182538 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.182611 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a0fae4c-619d-4c23-903e-127b57fc5211-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.182695 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsg5h\" (UniqueName: \"kubernetes.io/projected/4a0fae4c-619d-4c23-903e-127b57fc5211-kube-api-access-gsg5h\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.650836 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-98jrp"
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.651270 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-98jrp" event={"ID":"4a0fae4c-619d-4c23-903e-127b57fc5211","Type":"ContainerDied","Data":"1c51f9c29b434fb0580731cf9f250bcf1341babc700a92f0d611aa4c7dc701c5"}
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.651299 4759 scope.go:117] "RemoveContainer" containerID="a0cef4beecf01ba853be9ca99d1c1da65c718f3048d1cf4614d3124b90840783"
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.707363 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-98jrp"]
Dec 05 00:46:21 crc kubenswrapper[4759]: I1205 00:46:21.717119 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-98jrp"]
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.669679 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ba26284-ae89-4046-bc8c-acf49206704f","Type":"ContainerStarted","Data":"c70695da30aaa03429fa5b6dfb1e3bdf0e3be0a4745c0538e1f7b2f35b7181b6"}
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.684766 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7677b6f8d5-zwkn7"]
Dec 05 00:46:22 crc kubenswrapper[4759]: E1205 00:46:22.685186 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a0fae4c-619d-4c23-903e-127b57fc5211" containerName="init"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.685203 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a0fae4c-619d-4c23-903e-127b57fc5211" containerName="init"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.685487 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a0fae4c-619d-4c23-903e-127b57fc5211" containerName="init"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.687519 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.689842 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.689925 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.702509 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7677b6f8d5-zwkn7"]
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.811492 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-combined-ca-bundle\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.811578 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-httpd-config\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.811608 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-ovndb-tls-certs\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.811630 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jnph\" (UniqueName: \"kubernetes.io/projected/b885b03c-f613-4c09-9ec3-8492c335923a-kube-api-access-5jnph\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.811688 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-internal-tls-certs\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.811718 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-config\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.811760 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-public-tls-certs\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.913688 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-httpd-config\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.913737 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-ovndb-tls-certs\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.913769 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jnph\" (UniqueName: \"kubernetes.io/projected/b885b03c-f613-4c09-9ec3-8492c335923a-kube-api-access-5jnph\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.913796 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-internal-tls-certs\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.913832 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-config\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.913883 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-public-tls-certs\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.913968 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-combined-ca-bundle\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.921454 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-public-tls-certs\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.923840 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-internal-tls-certs\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.928505 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-ovndb-tls-certs\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.930799 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-httpd-config\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.936331 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-config\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.938146 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jnph\" (UniqueName: \"kubernetes.io/projected/b885b03c-f613-4c09-9ec3-8492c335923a-kube-api-access-5jnph\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:22 crc kubenswrapper[4759]: I1205 00:46:22.938201 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b885b03c-f613-4c09-9ec3-8492c335923a-combined-ca-bundle\") pod \"neutron-7677b6f8d5-zwkn7\" (UID: \"b885b03c-f613-4c09-9ec3-8492c335923a\") " pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:23 crc kubenswrapper[4759]: I1205 00:46:23.013756 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:23 crc kubenswrapper[4759]: I1205 00:46:23.176130 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a0fae4c-619d-4c23-903e-127b57fc5211" path="/var/lib/kubelet/pods/4a0fae4c-619d-4c23-903e-127b57fc5211/volumes"
Dec 05 00:46:23 crc kubenswrapper[4759]: I1205 00:46:23.703827 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7677b6f8d5-zwkn7"]
Dec 05 00:46:23 crc kubenswrapper[4759]: I1205 00:46:23.773575 4759 generic.go:334] "Generic (PLEG): container finished" podID="9e5f8952-6ad7-472f-b295-b426f1404270" containerID="dc4d9b28b4fb076966c0c353c9b60133f46742a73a6f8997bc979ccb3230050e" exitCode=0
Dec 05 00:46:23 crc kubenswrapper[4759]: I1205 00:46:23.774036 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hkbq2" event={"ID":"9e5f8952-6ad7-472f-b295-b426f1404270","Type":"ContainerDied","Data":"dc4d9b28b4fb076966c0c353c9b60133f46742a73a6f8997bc979ccb3230050e"}
Dec 05 00:46:23 crc kubenswrapper[4759]: I1205 00:46:23.787648 4759 generic.go:334] "Generic (PLEG): container finished" podID="8bcbbf13-82ed-4c4a-8694-d3149d730cb0" containerID="f85a35ddba335d95eaf0401b53d370c3576a48693a5629c7b064127025b99058" exitCode=0
Dec 05 00:46:23 crc kubenswrapper[4759]: I1205 00:46:23.787691 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s5c5z" event={"ID":"8bcbbf13-82ed-4c4a-8694-d3149d730cb0","Type":"ContainerDied","Data":"f85a35ddba335d95eaf0401b53d370c3576a48693a5629c7b064127025b99058"}
Dec 05 00:46:24 crc kubenswrapper[4759]: I1205 00:46:24.840135 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7677b6f8d5-zwkn7" event={"ID":"b885b03c-f613-4c09-9ec3-8492c335923a","Type":"ContainerStarted","Data":"ebbb04fd524782b7f8512546dea0d742870ebf66766a2e17fa43aacaab328259"}
Dec 05 00:46:24 crc kubenswrapper[4759]: I1205 00:46:24.840544 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7677b6f8d5-zwkn7" event={"ID":"b885b03c-f613-4c09-9ec3-8492c335923a","Type":"ContainerStarted","Data":"789338782ffbfd88aa915226d5b9e5dd8633399b475826a4f8a663fe947f2034"}
Dec 05 00:46:24 crc kubenswrapper[4759]: I1205 00:46:24.840570 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:24 crc kubenswrapper[4759]: I1205 00:46:24.840585 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7677b6f8d5-zwkn7" event={"ID":"b885b03c-f613-4c09-9ec3-8492c335923a","Type":"ContainerStarted","Data":"dfe56bf07062c29a23debc8c360c29a9e448029c8cea61bf32d224724b67623e"}
Dec 05 00:46:24 crc kubenswrapper[4759]: I1205 00:46:24.842460 4759 generic.go:334] "Generic (PLEG): container finished" podID="419ed25e-6ca1-4ca7-978e-7b1464982278" containerID="9c4a66b1bac8326805e90951a7af9a0adb1cf7a0045b18833bd1e9f4dafc6d82" exitCode=0
Dec 05 00:46:24 crc kubenswrapper[4759]: I1205 00:46:24.842509 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vvtw7" event={"ID":"419ed25e-6ca1-4ca7-978e-7b1464982278","Type":"ContainerDied","Data":"9c4a66b1bac8326805e90951a7af9a0adb1cf7a0045b18833bd1e9f4dafc6d82"}
Dec 05 00:46:24 crc kubenswrapper[4759]: I1205 00:46:24.870807 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7677b6f8d5-zwkn7" podStartSLOduration=2.870786116 podStartE2EDuration="2.870786116s" podCreationTimestamp="2025-12-05 00:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:24.86606269 +0000 UTC m=+1404.081723640" watchObservedRunningTime="2025-12-05 00:46:24.870786116 +0000 UTC m=+1404.086447066"
Dec 05 00:46:27 crc kubenswrapper[4759]: I1205 00:46:27.896523 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q"
Dec 05 00:46:27 crc kubenswrapper[4759]: I1205 00:46:27.966716 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-4v2kj"]
Dec 05 00:46:27 crc kubenswrapper[4759]: I1205 00:46:27.966942 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" podUID="a321a07e-cbc8-41ae-82db-0d2cbab7e333" containerName="dnsmasq-dns" containerID="cri-o://c19eb91359e9fd35bb8be8867dfb760ad3292f31882224455f4beb9ecec628e6" gracePeriod=10
Dec 05 00:46:29 crc kubenswrapper[4759]: I1205 00:46:29.906290 4759 generic.go:334] "Generic (PLEG): container finished" podID="a321a07e-cbc8-41ae-82db-0d2cbab7e333" containerID="c19eb91359e9fd35bb8be8867dfb760ad3292f31882224455f4beb9ecec628e6" exitCode=0
Dec 05 00:46:29 crc kubenswrapper[4759]: I1205 00:46:29.906407 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" event={"ID":"a321a07e-cbc8-41ae-82db-0d2cbab7e333","Type":"ContainerDied","Data":"c19eb91359e9fd35bb8be8867dfb760ad3292f31882224455f4beb9ecec628e6"}
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.829228 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s5c5z"
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.929582 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s5c5z"
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.929671 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s5c5z" event={"ID":"8bcbbf13-82ed-4c4a-8694-d3149d730cb0","Type":"ContainerDied","Data":"2c3e505da216d1dd1450ab1579769f7d8e501695d02fba9483e63aaf9717dd28"}
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.930453 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c3e505da216d1dd1450ab1579769f7d8e501695d02fba9483e63aaf9717dd28"
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.932223 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vvtw7" event={"ID":"419ed25e-6ca1-4ca7-978e-7b1464982278","Type":"ContainerDied","Data":"4a7604134582f4690bdc51ccc90dee03028b0e968411de39a0599a9ac942f6c1"}
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.932250 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a7604134582f4690bdc51ccc90dee03028b0e968411de39a0599a9ac942f6c1"
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.937553 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hkbq2" event={"ID":"9e5f8952-6ad7-472f-b295-b426f1404270","Type":"ContainerDied","Data":"48cf60be5250dc279460f0246d870ac581c97f57478acc003de49100ea99c6fc"}
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.937600 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48cf60be5250dc279460f0246d870ac581c97f57478acc003de49100ea99c6fc"
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.984111 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hkbq2"
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.995003 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-logs\") pod \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") "
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.995116 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tn5x\" (UniqueName: \"kubernetes.io/projected/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-kube-api-access-4tn5x\") pod \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") "
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.995208 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-config-data\") pod \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") "
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.995425 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-scripts\") pod \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") "
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.995540 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-combined-ca-bundle\") pod \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\" (UID: \"8bcbbf13-82ed-4c4a-8694-d3149d730cb0\") "
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.995731 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-logs" (OuterVolumeSpecName: "logs") pod "8bcbbf13-82ed-4c4a-8694-d3149d730cb0" (UID: "8bcbbf13-82ed-4c4a-8694-d3149d730cb0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:46:31 crc kubenswrapper[4759]: I1205 00:46:31.996005 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-logs\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.002978 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-scripts" (OuterVolumeSpecName: "scripts") pod "8bcbbf13-82ed-4c4a-8694-d3149d730cb0" (UID: "8bcbbf13-82ed-4c4a-8694-d3149d730cb0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.017523 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-kube-api-access-4tn5x" (OuterVolumeSpecName: "kube-api-access-4tn5x") pod "8bcbbf13-82ed-4c4a-8694-d3149d730cb0" (UID: "8bcbbf13-82ed-4c4a-8694-d3149d730cb0"). InnerVolumeSpecName "kube-api-access-4tn5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.022451 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vvtw7"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.062269 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-config-data" (OuterVolumeSpecName: "config-data") pod "8bcbbf13-82ed-4c4a-8694-d3149d730cb0" (UID: "8bcbbf13-82ed-4c4a-8694-d3149d730cb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.074563 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bcbbf13-82ed-4c4a-8694-d3149d730cb0" (UID: "8bcbbf13-82ed-4c4a-8694-d3149d730cb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.094206 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.096969 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-combined-ca-bundle\") pod \"9e5f8952-6ad7-472f-b295-b426f1404270\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.097086 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-fernet-keys\") pod \"9e5f8952-6ad7-472f-b295-b426f1404270\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.097188 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mwfl\" (UniqueName: \"kubernetes.io/projected/9e5f8952-6ad7-472f-b295-b426f1404270-kube-api-access-7mwfl\") pod \"9e5f8952-6ad7-472f-b295-b426f1404270\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.097349 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-scripts\") pod \"9e5f8952-6ad7-472f-b295-b426f1404270\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.097449 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-credential-keys\") pod \"9e5f8952-6ad7-472f-b295-b426f1404270\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.097541 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-config-data\") pod \"9e5f8952-6ad7-472f-b295-b426f1404270\" (UID: \"9e5f8952-6ad7-472f-b295-b426f1404270\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.098000 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tn5x\" (UniqueName: \"kubernetes.io/projected/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-kube-api-access-4tn5x\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.098069 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.098121 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.098178 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bcbbf13-82ed-4c4a-8694-d3149d730cb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.107649 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9e5f8952-6ad7-472f-b295-b426f1404270" (UID: "9e5f8952-6ad7-472f-b295-b426f1404270"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.107701 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9e5f8952-6ad7-472f-b295-b426f1404270" (UID: "9e5f8952-6ad7-472f-b295-b426f1404270"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.108440 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-scripts" (OuterVolumeSpecName: "scripts") pod "9e5f8952-6ad7-472f-b295-b426f1404270" (UID: "9e5f8952-6ad7-472f-b295-b426f1404270"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.108989 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e5f8952-6ad7-472f-b295-b426f1404270-kube-api-access-7mwfl" (OuterVolumeSpecName: "kube-api-access-7mwfl") pod "9e5f8952-6ad7-472f-b295-b426f1404270" (UID: "9e5f8952-6ad7-472f-b295-b426f1404270"). InnerVolumeSpecName "kube-api-access-7mwfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.154184 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e5f8952-6ad7-472f-b295-b426f1404270" (UID: "9e5f8952-6ad7-472f-b295-b426f1404270"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.178432 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-config-data" (OuterVolumeSpecName: "config-data") pod "9e5f8952-6ad7-472f-b295-b426f1404270" (UID: "9e5f8952-6ad7-472f-b295-b426f1404270"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.199386 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-config\") pod \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.199430 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-swift-storage-0\") pod \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.199463 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ln7d\" (UniqueName: \"kubernetes.io/projected/a321a07e-cbc8-41ae-82db-0d2cbab7e333-kube-api-access-9ln7d\") pod \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.199488 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfz4x\" (UniqueName: \"kubernetes.io/projected/419ed25e-6ca1-4ca7-978e-7b1464982278-kube-api-access-cfz4x\") pod \"419ed25e-6ca1-4ca7-978e-7b1464982278\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.199510 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-nb\") pod \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.199572 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-sb\") pod \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.199609 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-svc\") pod \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\" (UID: \"a321a07e-cbc8-41ae-82db-0d2cbab7e333\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.199631 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-db-sync-config-data\") pod \"419ed25e-6ca1-4ca7-978e-7b1464982278\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.199662 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-combined-ca-bundle\") pod \"419ed25e-6ca1-4ca7-978e-7b1464982278\" (UID: \"419ed25e-6ca1-4ca7-978e-7b1464982278\") "
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.200027 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.200044 4759 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.200053 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mwfl\" (UniqueName: \"kubernetes.io/projected/9e5f8952-6ad7-472f-b295-b426f1404270-kube-api-access-7mwfl\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.200063 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.200071 4759 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.200081 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5f8952-6ad7-472f-b295-b426f1404270-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.215655 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/419ed25e-6ca1-4ca7-978e-7b1464982278-kube-api-access-cfz4x" (OuterVolumeSpecName: "kube-api-access-cfz4x") pod "419ed25e-6ca1-4ca7-978e-7b1464982278" (UID: "419ed25e-6ca1-4ca7-978e-7b1464982278"). InnerVolumeSpecName "kube-api-access-cfz4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.215831 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "419ed25e-6ca1-4ca7-978e-7b1464982278" (UID: "419ed25e-6ca1-4ca7-978e-7b1464982278"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.222687 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a321a07e-cbc8-41ae-82db-0d2cbab7e333-kube-api-access-9ln7d" (OuterVolumeSpecName: "kube-api-access-9ln7d") pod "a321a07e-cbc8-41ae-82db-0d2cbab7e333" (UID: "a321a07e-cbc8-41ae-82db-0d2cbab7e333"). InnerVolumeSpecName "kube-api-access-9ln7d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.248218 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a321a07e-cbc8-41ae-82db-0d2cbab7e333" (UID: "a321a07e-cbc8-41ae-82db-0d2cbab7e333"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.248356 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a321a07e-cbc8-41ae-82db-0d2cbab7e333" (UID: "a321a07e-cbc8-41ae-82db-0d2cbab7e333"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.252644 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-config" (OuterVolumeSpecName: "config") pod "a321a07e-cbc8-41ae-82db-0d2cbab7e333" (UID: "a321a07e-cbc8-41ae-82db-0d2cbab7e333"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.252786 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a321a07e-cbc8-41ae-82db-0d2cbab7e333" (UID: "a321a07e-cbc8-41ae-82db-0d2cbab7e333"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.258202 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a321a07e-cbc8-41ae-82db-0d2cbab7e333" (UID: "a321a07e-cbc8-41ae-82db-0d2cbab7e333"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.265131 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "419ed25e-6ca1-4ca7-978e-7b1464982278" (UID: "419ed25e-6ca1-4ca7-978e-7b1464982278"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.301536 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-config\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.301566 4759 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.301577 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ln7d\" (UniqueName: \"kubernetes.io/projected/a321a07e-cbc8-41ae-82db-0d2cbab7e333-kube-api-access-9ln7d\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.301586 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfz4x\" (UniqueName: \"kubernetes.io/projected/419ed25e-6ca1-4ca7-978e-7b1464982278-kube-api-access-cfz4x\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.301595 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.301604 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.301612 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a321a07e-cbc8-41ae-82db-0d2cbab7e333-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.301621 4759 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.301629 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419ed25e-6ca1-4ca7-978e-7b1464982278-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.951082 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ba26284-ae89-4046-bc8c-acf49206704f","Type":"ContainerStarted","Data":"0175c3717dc074762ceb841920ecb3659b8e71c7df9ee3746ec3991c29e0a2e2"}
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.952798 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xbx7b" event={"ID":"ed73e23b-4161-4968-93d0-aaabce1aa4bb","Type":"ContainerStarted","Data":"f0080ba3f01b33a6e53a87494e4ada708db52a9e16f2c01af9cbf29c650c23b1"}
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.954562 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wdw54" event={"ID":"8fe2c3db-f452-4009-abca-b9ee975ad38d","Type":"ContainerStarted","Data":"598b58270ecae4c3e21a1f7bb2001bba2b3512ef5d43292a7ed347f7b30bd769"}
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.956829 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj" event={"ID":"a321a07e-cbc8-41ae-82db-0d2cbab7e333","Type":"ContainerDied","Data":"3b5652f00503b8383b78a1195ff3a0a40f44ec942c6dbd1d6f223e4806465ff8"}
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.956903 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-4v2kj"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.956915 4759 scope.go:117] "RemoveContainer" containerID="c19eb91359e9fd35bb8be8867dfb760ad3292f31882224455f4beb9ecec628e6"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.956930 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hkbq2"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.956852 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vvtw7"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.969748 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5bb6fdf748-gpknz"]
Dec 05 00:46:32 crc kubenswrapper[4759]: E1205 00:46:32.970127 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a321a07e-cbc8-41ae-82db-0d2cbab7e333" containerName="init"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.970145 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a321a07e-cbc8-41ae-82db-0d2cbab7e333" containerName="init"
Dec 05 00:46:32 crc kubenswrapper[4759]: E1205 00:46:32.970159 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="419ed25e-6ca1-4ca7-978e-7b1464982278" containerName="barbican-db-sync"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.970165 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="419ed25e-6ca1-4ca7-978e-7b1464982278" containerName="barbican-db-sync"
Dec 05 00:46:32 crc kubenswrapper[4759]: E1205 00:46:32.970179 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a321a07e-cbc8-41ae-82db-0d2cbab7e333" containerName="dnsmasq-dns"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.970186 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a321a07e-cbc8-41ae-82db-0d2cbab7e333" containerName="dnsmasq-dns"
Dec 05 00:46:32 crc kubenswrapper[4759]: E1205 00:46:32.970197 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5f8952-6ad7-472f-b295-b426f1404270" containerName="keystone-bootstrap"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.970205 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5f8952-6ad7-472f-b295-b426f1404270" containerName="keystone-bootstrap"
Dec 05 00:46:32 crc kubenswrapper[4759]: E1205 00:46:32.970221 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bcbbf13-82ed-4c4a-8694-d3149d730cb0" containerName="placement-db-sync"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.970226 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcbbf13-82ed-4c4a-8694-d3149d730cb0" containerName="placement-db-sync"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.970404 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="a321a07e-cbc8-41ae-82db-0d2cbab7e333" containerName="dnsmasq-dns"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.970429 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bcbbf13-82ed-4c4a-8694-d3149d730cb0" containerName="placement-db-sync"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.970444 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="419ed25e-6ca1-4ca7-978e-7b1464982278" containerName="barbican-db-sync"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.970455 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e5f8952-6ad7-472f-b295-b426f1404270" containerName="keystone-bootstrap"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.971432 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.979437 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.979856 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.980078 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.980300 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.980501 4759 scope.go:117] "RemoveContainer" containerID="da6a161f851242ffea24d247a269e9d11dae25a1465b018b5749bdecb109e157"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.980513 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dslrx"
Dec 05 00:46:32 crc kubenswrapper[4759]: I1205 00:46:32.993974 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bb6fdf748-gpknz"]
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.003101 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-xbx7b" podStartSLOduration=3.193689995 podStartE2EDuration="47.003087632s" podCreationTimestamp="2025-12-05 00:45:46 +0000 UTC" firstStartedPulling="2025-12-05 00:45:48.008540789 +0000 UTC m=+1367.224201739" lastFinishedPulling="2025-12-05 00:46:31.817938386 +0000 UTC m=+1411.033599376" observedRunningTime="2025-12-05 00:46:32.988830883 +0000 UTC m=+1412.204491843" watchObservedRunningTime="2025-12-05 00:46:33.003087632 +0000 UTC m=+1412.218748582"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.106918 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-wdw54" podStartSLOduration=3.304150188 podStartE2EDuration="47.106897962s" podCreationTimestamp="2025-12-05 00:45:46 +0000 UTC" firstStartedPulling="2025-12-05 00:45:48.020726767 +0000 UTC m=+1367.236387718" lastFinishedPulling="2025-12-05 00:46:31.823474522 +0000 UTC m=+1411.039135492" observedRunningTime="2025-12-05 00:46:33.07264811 +0000 UTC m=+1412.288309060" watchObservedRunningTime="2025-12-05 00:46:33.106897962 +0000 UTC m=+1412.322558912"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.119915 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpfwg\" (UniqueName: \"kubernetes.io/projected/a2ebc8a7-dfee-4768-a3c2-976932027197-kube-api-access-fpfwg\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.119989 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-combined-ca-bundle\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.120076 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-public-tls-certs\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.120099 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2ebc8a7-dfee-4768-a3c2-976932027197-logs\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.120123 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-config-data\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.120187 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-internal-tls-certs\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.120208 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-scripts\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.130064 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-4v2kj"]
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.145226 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-4v2kj"]
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.157352 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b7c787945-xkhnb"]
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.158836 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b7c787945-xkhnb"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.166429 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.166570 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nl4ht"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.166668 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.166687 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.166575 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.166781 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.207913 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a321a07e-cbc8-41ae-82db-0d2cbab7e333" path="/var/lib/kubelet/pods/a321a07e-cbc8-41ae-82db-0d2cbab7e333/volumes"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.208765 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b7c787945-xkhnb"]
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.239634 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-public-tls-certs\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.239706 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-fernet-keys\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.239748 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-internal-tls-certs\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.239775 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-scripts\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.239822 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-scripts\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.239839 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-credential-keys\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.239868 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-config-data\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.239912 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpfwg\" (UniqueName: \"kubernetes.io/projected/a2ebc8a7-dfee-4768-a3c2-976932027197-kube-api-access-fpfwg\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.239968 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm29g\" (UniqueName: \"kubernetes.io/projected/b37f0510-4911-4842-866a-863c4ac7e7c9-kube-api-access-tm29g\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.239992 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-combined-ca-bundle\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.240066 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-public-tls-certs\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.240093 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2ebc8a7-dfee-4768-a3c2-976932027197-logs\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.240109 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-combined-ca-bundle\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.240123 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-internal-tls-certs\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb"
Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.240150 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"config-data\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-config-data\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.247615 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2ebc8a7-dfee-4768-a3c2-976932027197-logs\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.263382 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-internal-tls-certs\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.271733 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-scripts\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.271799 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2"] Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.273496 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.274846 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-public-tls-certs\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.275937 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-combined-ca-bundle\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.275956 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2ebc8a7-dfee-4768-a3c2-976932027197-config-data\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.278083 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.278356 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-k5k95" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.278476 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.303694 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpfwg\" (UniqueName: 
\"kubernetes.io/projected/a2ebc8a7-dfee-4768-a3c2-976932027197-kube-api-access-fpfwg\") pod \"placement-5bb6fdf748-gpknz\" (UID: \"a2ebc8a7-dfee-4768-a3c2-976932027197\") " pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.306279 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-77bd8fcb75-d6pnc"] Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.307911 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.310361 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.343403 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-internal-tls-certs\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.343441 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-combined-ca-bundle\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.343477 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-public-tls-certs\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.343498 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-fernet-keys\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.343536 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-scripts\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.343554 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-credential-keys\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.343580 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-config-data\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.343631 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm29g\" 
(UniqueName: \"kubernetes.io/projected/b37f0510-4911-4842-866a-863c4ac7e7c9-kube-api-access-tm29g\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.357765 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2"] Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.369019 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-combined-ca-bundle\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.371031 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-fernet-keys\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.371245 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-scripts\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.371625 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-public-tls-certs\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.371982 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-credential-keys\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.372906 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-config-data\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.379235 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b37f0510-4911-4842-866a-863c4ac7e7c9-internal-tls-certs\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.385870 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.412762 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-77bd8fcb75-d6pnc"] Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.450773 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e7388a6-d295-4807-8ce7-1eeb7dc55707-config-data-custom\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.450863 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7388a6-d295-4807-8ce7-1eeb7dc55707-config-data\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.450917 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t6dx\" (UniqueName: \"kubernetes.io/projected/0e7388a6-d295-4807-8ce7-1eeb7dc55707-kube-api-access-5t6dx\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.450945 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9502dfee-cb5d-44de-a549-4f0060d29d9b-combined-ca-bundle\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.450997 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e7388a6-d295-4807-8ce7-1eeb7dc55707-logs\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.451088 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9502dfee-cb5d-44de-a549-4f0060d29d9b-logs\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.451172 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9502dfee-cb5d-44de-a549-4f0060d29d9b-config-data\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.451258 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9502dfee-cb5d-44de-a549-4f0060d29d9b-config-data-custom\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: 
\"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.451340 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vnrp\" (UniqueName: \"kubernetes.io/projected/9502dfee-cb5d-44de-a549-4f0060d29d9b-kube-api-access-5vnrp\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.451397 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7388a6-d295-4807-8ce7-1eeb7dc55707-combined-ca-bundle\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.513895 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm29g\" (UniqueName: \"kubernetes.io/projected/b37f0510-4911-4842-866a-863c4ac7e7c9-kube-api-access-tm29g\") pod \"keystone-b7c787945-xkhnb\" (UID: \"b37f0510-4911-4842-866a-863c4ac7e7c9\") " pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.527678 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4fzp8"] Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.553456 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.555264 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7388a6-d295-4807-8ce7-1eeb7dc55707-config-data\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.556486 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t6dx\" (UniqueName: \"kubernetes.io/projected/0e7388a6-d295-4807-8ce7-1eeb7dc55707-kube-api-access-5t6dx\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.556539 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9502dfee-cb5d-44de-a549-4f0060d29d9b-combined-ca-bundle\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.556568 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e7388a6-d295-4807-8ce7-1eeb7dc55707-logs\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.556628 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9502dfee-cb5d-44de-a549-4f0060d29d9b-logs\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.556679 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9502dfee-cb5d-44de-a549-4f0060d29d9b-config-data\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.556730 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9502dfee-cb5d-44de-a549-4f0060d29d9b-config-data-custom\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.556748 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vnrp\" (UniqueName: \"kubernetes.io/projected/9502dfee-cb5d-44de-a549-4f0060d29d9b-kube-api-access-5vnrp\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.556771 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7388a6-d295-4807-8ce7-1eeb7dc55707-combined-ca-bundle\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.556793 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e7388a6-d295-4807-8ce7-1eeb7dc55707-config-data-custom\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.557729 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9502dfee-cb5d-44de-a549-4f0060d29d9b-logs\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.561112 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9502dfee-cb5d-44de-a549-4f0060d29d9b-config-data-custom\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.561834 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e7388a6-d295-4807-8ce7-1eeb7dc55707-config-data-custom\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.562076 4759 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e7388a6-d295-4807-8ce7-1eeb7dc55707-logs\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.569804 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7388a6-d295-4807-8ce7-1eeb7dc55707-combined-ca-bundle\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.571501 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7388a6-d295-4807-8ce7-1eeb7dc55707-config-data\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.577021 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9502dfee-cb5d-44de-a549-4f0060d29d9b-config-data\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.595967 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9502dfee-cb5d-44de-a549-4f0060d29d9b-combined-ca-bundle\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.600287 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vnrp\" (UniqueName: \"kubernetes.io/projected/9502dfee-cb5d-44de-a549-4f0060d29d9b-kube-api-access-5vnrp\") pod \"barbican-worker-77bd8fcb75-d6pnc\" (UID: \"9502dfee-cb5d-44de-a549-4f0060d29d9b\") " pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.622848 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t6dx\" (UniqueName: \"kubernetes.io/projected/0e7388a6-d295-4807-8ce7-1eeb7dc55707-kube-api-access-5t6dx\") pod \"barbican-keystone-listener-7d8dd9dc58-8n9k2\" (UID: \"0e7388a6-d295-4807-8ce7-1eeb7dc55707\") " pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.642369 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4fzp8"] Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.658261 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-svc\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.658346 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpc2g\" (UniqueName: \"kubernetes.io/projected/cfb699fb-b45e-408a-a77c-91f48c5b9e08-kube-api-access-hpc2g\") pod 
\"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.658435 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.658465 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.658492 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.658543 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-config\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.681924 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5b7dd4f9bb-wh6p9"] Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.689469 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.697081 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.755499 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b7dd4f9bb-wh6p9"] Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.760646 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.760722 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.760752 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.760804 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-config\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.760827 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-svc\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.760869 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpc2g\" (UniqueName: \"kubernetes.io/projected/cfb699fb-b45e-408a-a77c-91f48c5b9e08-kube-api-access-hpc2g\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.762060 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.762587 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 
00:46:33.763098 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.763766 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-config\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.764062 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-svc\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.778904 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.814171 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpc2g\" (UniqueName: \"kubernetes.io/projected/cfb699fb-b45e-408a-a77c-91f48c5b9e08-kube-api-access-hpc2g\") pod \"dnsmasq-dns-688c87cc99-4fzp8\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.825577 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.852996 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-77bd8fcb75-d6pnc" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.863689 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data-custom\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.864074 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6lv6\" (UniqueName: \"kubernetes.io/projected/2b80deca-451b-446c-88b9-42c4521b4cc8-kube-api-access-r6lv6\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.869541 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b80deca-451b-446c-88b9-42c4521b4cc8-logs\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.870923 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-combined-ca-bundle\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.871117 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.872383 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.992475 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-combined-ca-bundle\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.993696 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.993856 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data-custom\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.993915 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6lv6\" (UniqueName: \"kubernetes.io/projected/2b80deca-451b-446c-88b9-42c4521b4cc8-kube-api-access-r6lv6\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.994223 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b80deca-451b-446c-88b9-42c4521b4cc8-logs\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:33 crc kubenswrapper[4759]: I1205 00:46:33.995085 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b80deca-451b-446c-88b9-42c4521b4cc8-logs\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:34 crc kubenswrapper[4759]: I1205 00:46:34.018393 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:34 crc kubenswrapper[4759]: I1205 00:46:34.020936 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data-custom\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:34 crc kubenswrapper[4759]: I1205 00:46:34.021331 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6lv6\" (UniqueName: \"kubernetes.io/projected/2b80deca-451b-446c-88b9-42c4521b4cc8-kube-api-access-r6lv6\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 
00:46:34 crc kubenswrapper[4759]: I1205 00:46:34.055622 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-combined-ca-bundle\") pod \"barbican-api-5b7dd4f9bb-wh6p9\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:34 crc kubenswrapper[4759]: I1205 00:46:34.070276 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:34 crc kubenswrapper[4759]: I1205 00:46:34.202014 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bb6fdf748-gpknz"] Dec 05 00:46:34 crc kubenswrapper[4759]: W1205 00:46:34.219108 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2ebc8a7_dfee_4768_a3c2_976932027197.slice/crio-48ff14b1ba84fc6415e145cb474d084198e13a444dce26bcee6a8e0d1cfcb6c5 WatchSource:0}: Error finding container 48ff14b1ba84fc6415e145cb474d084198e13a444dce26bcee6a8e0d1cfcb6c5: Status 404 returned error can't find the container with id 48ff14b1ba84fc6415e145cb474d084198e13a444dce26bcee6a8e0d1cfcb6c5 Dec 05 00:46:34 crc kubenswrapper[4759]: I1205 00:46:34.496107 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b7c787945-xkhnb"] Dec 05 00:46:34 crc kubenswrapper[4759]: I1205 00:46:34.643155 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2"] Dec 05 00:46:34 crc kubenswrapper[4759]: I1205 00:46:34.677435 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4fzp8"] Dec 05 00:46:34 crc kubenswrapper[4759]: W1205 00:46:34.689466 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e7388a6_d295_4807_8ce7_1eeb7dc55707.slice/crio-909f509efed13e12f9e50c3136625403402c2cc5436bd15961e05928b2bedd9e WatchSource:0}: Error finding container 909f509efed13e12f9e50c3136625403402c2cc5436bd15961e05928b2bedd9e: Status 404 returned error can't find the container with id 909f509efed13e12f9e50c3136625403402c2cc5436bd15961e05928b2bedd9e Dec 05 00:46:34 crc kubenswrapper[4759]: W1205 00:46:34.691031 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfb699fb_b45e_408a_a77c_91f48c5b9e08.slice/crio-312d61be45e0b91bc5689e18f03be8437a3cbfb949daa74494662ea6eb95e24e WatchSource:0}: Error finding container 312d61be45e0b91bc5689e18f03be8437a3cbfb949daa74494662ea6eb95e24e: Status 404 returned error can't find the container with id 312d61be45e0b91bc5689e18f03be8437a3cbfb949daa74494662ea6eb95e24e Dec 05 00:46:34 crc kubenswrapper[4759]: I1205 00:46:34.832167 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-77bd8fcb75-d6pnc"] Dec 05 00:46:34 crc kubenswrapper[4759]: I1205 00:46:34.842820 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b7dd4f9bb-wh6p9"] Dec 05 00:46:35 crc kubenswrapper[4759]: I1205 00:46:35.023747 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" event={"ID":"cfb699fb-b45e-408a-a77c-91f48c5b9e08","Type":"ContainerStarted","Data":"312d61be45e0b91bc5689e18f03be8437a3cbfb949daa74494662ea6eb95e24e"} Dec 05 00:46:35 crc kubenswrapper[4759]: I1205 
00:46:35.025364 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" event={"ID":"0e7388a6-d295-4807-8ce7-1eeb7dc55707","Type":"ContainerStarted","Data":"909f509efed13e12f9e50c3136625403402c2cc5436bd15961e05928b2bedd9e"} Dec 05 00:46:35 crc kubenswrapper[4759]: I1205 00:46:35.026664 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bb6fdf748-gpknz" event={"ID":"a2ebc8a7-dfee-4768-a3c2-976932027197","Type":"ContainerStarted","Data":"d94c6629cf57e621ff1e62b33f9a4a4ff0752d20d51fd2ededa2e979626d97a2"} Dec 05 00:46:35 crc kubenswrapper[4759]: I1205 00:46:35.026687 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bb6fdf748-gpknz" event={"ID":"a2ebc8a7-dfee-4768-a3c2-976932027197","Type":"ContainerStarted","Data":"48ff14b1ba84fc6415e145cb474d084198e13a444dce26bcee6a8e0d1cfcb6c5"} Dec 05 00:46:35 crc kubenswrapper[4759]: I1205 00:46:35.028775 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b7c787945-xkhnb" event={"ID":"b37f0510-4911-4842-866a-863c4ac7e7c9","Type":"ContainerStarted","Data":"2acb4f1b5d2fb5b60cedc300f7849d27a119f7386dfe6b681c7e5fa9085979ff"} Dec 05 00:46:35 crc kubenswrapper[4759]: I1205 00:46:35.031375 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" event={"ID":"2b80deca-451b-446c-88b9-42c4521b4cc8","Type":"ContainerStarted","Data":"2a480e2be8592da85048d17f7fbce681eaccfaa361744ec82f3f9225ae60acd3"} Dec 05 00:46:35 crc kubenswrapper[4759]: I1205 00:46:35.034782 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77bd8fcb75-d6pnc" event={"ID":"9502dfee-cb5d-44de-a549-4f0060d29d9b","Type":"ContainerStarted","Data":"3e5749aad1ac66b9f1bef5f9aa683c10e7f6da81d19ad24e7e1f0ca97d785ca0"} Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.056694 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" event={"ID":"2b80deca-451b-446c-88b9-42c4521b4cc8","Type":"ContainerStarted","Data":"88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c"} Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.057236 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" event={"ID":"2b80deca-451b-446c-88b9-42c4521b4cc8","Type":"ContainerStarted","Data":"076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff"} Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.057254 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.057266 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.060407 4759 generic.go:334] "Generic (PLEG): container finished" podID="cfb699fb-b45e-408a-a77c-91f48c5b9e08" containerID="6974db3af9cab0e6d2b8359b751f03f72125fc284a9a509cebe32de391e0db4f" exitCode=0 Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.060442 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" event={"ID":"cfb699fb-b45e-408a-a77c-91f48c5b9e08","Type":"ContainerDied","Data":"6974db3af9cab0e6d2b8359b751f03f72125fc284a9a509cebe32de391e0db4f"} Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.063143 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-5bb6fdf748-gpknz" event={"ID":"a2ebc8a7-dfee-4768-a3c2-976932027197","Type":"ContainerStarted","Data":"15ad49bc5e29759859dcbafdfc9aa78758f06afba1d6ca08cd5e4e9725479220"} Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.063227 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.063276 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.069200 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b7c787945-xkhnb" event={"ID":"b37f0510-4911-4842-866a-863c4ac7e7c9","Type":"ContainerStarted","Data":"aa2aeb213ec0d42ee1bffea70b40dbda3530ff5065a1ca992cf01e61901b3e93"} Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.069747 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.106824 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5bb6fdf748-gpknz" podStartSLOduration=4.106803312 podStartE2EDuration="4.106803312s" podCreationTimestamp="2025-12-05 00:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:36.099674057 +0000 UTC m=+1415.315335007" watchObservedRunningTime="2025-12-05 00:46:36.106803312 +0000 UTC m=+1415.322464262" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.107300 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" podStartSLOduration=3.107294755 podStartE2EDuration="3.107294755s" podCreationTimestamp="2025-12-05 00:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:36.080347683 +0000 UTC m=+1415.296008633" watchObservedRunningTime="2025-12-05 00:46:36.107294755 +0000 UTC m=+1415.322955705" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.158068 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b7c787945-xkhnb" podStartSLOduration=3.158048931 podStartE2EDuration="3.158048931s" podCreationTimestamp="2025-12-05 00:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:36.150882985 +0000 UTC m=+1415.366543935" watchObservedRunningTime="2025-12-05 00:46:36.158048931 +0000 UTC m=+1415.373709881" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.485339 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-bb856b8d-7psj7"] Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.488162 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.490351 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.490544 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.507909 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bb856b8d-7psj7"] Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.648649 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-config-data-custom\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.648736 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-combined-ca-bundle\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.648774 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2tsz\" (UniqueName: \"kubernetes.io/projected/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-kube-api-access-x2tsz\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.648967 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-logs\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.649022 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-internal-tls-certs\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.649099 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-config-data\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.649148 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-public-tls-certs\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.751598 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-logs\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.752006 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-internal-tls-certs\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.752059 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-config-data\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.752088 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-public-tls-certs\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.752155 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-logs\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.752200 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-config-data-custom\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.752243 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-combined-ca-bundle\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.752285 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2tsz\" (UniqueName: \"kubernetes.io/projected/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-kube-api-access-x2tsz\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.756192 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-combined-ca-bundle\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.757347 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-config-data\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.757415 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-internal-tls-certs\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.758397 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-public-tls-certs\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.758576 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-config-data-custom\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.776710 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2tsz\" (UniqueName: \"kubernetes.io/projected/52bf4fd7-6aa6-4bdf-b8ac-60c071d42455-kube-api-access-x2tsz\") pod \"barbican-api-bb856b8d-7psj7\" (UID: \"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455\") " pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:36 crc kubenswrapper[4759]: I1205 00:46:36.819590 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:37 crc kubenswrapper[4759]: I1205 00:46:37.763784 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bb856b8d-7psj7"] Dec 05 00:46:37 crc kubenswrapper[4759]: W1205 00:46:37.771231 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52bf4fd7_6aa6_4bdf_b8ac_60c071d42455.slice/crio-c825a7ad741954d03643bbfbca38fa493e1becd01dacf130fcaab71c98150a4e WatchSource:0}: Error finding container c825a7ad741954d03643bbfbca38fa493e1becd01dacf130fcaab71c98150a4e: Status 404 returned error can't find the container with id c825a7ad741954d03643bbfbca38fa493e1becd01dacf130fcaab71c98150a4e Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.289662 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" event={"ID":"cfb699fb-b45e-408a-a77c-91f48c5b9e08","Type":"ContainerStarted","Data":"af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381"} Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.290582 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.292525 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" event={"ID":"0e7388a6-d295-4807-8ce7-1eeb7dc55707","Type":"ContainerStarted","Data":"02da738ad495a4421f04a2b65f26d426a8f0afc9219adf10cff6f29aeff8a009"} Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.292607 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" event={"ID":"0e7388a6-d295-4807-8ce7-1eeb7dc55707","Type":"ContainerStarted","Data":"acf65b4c5ea400fdb2611bf3db5a4961124b4d5f4236464fe5c3f1dc0665cc84"} Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.295037 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bb856b8d-7psj7" event={"ID":"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455","Type":"ContainerStarted","Data":"c825a7ad741954d03643bbfbca38fa493e1becd01dacf130fcaab71c98150a4e"} Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.304645 4759 generic.go:334] "Generic (PLEG): container finished" podID="ed73e23b-4161-4968-93d0-aaabce1aa4bb" containerID="f0080ba3f01b33a6e53a87494e4ada708db52a9e16f2c01af9cbf29c650c23b1" exitCode=0 Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.304731 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xbx7b" event={"ID":"ed73e23b-4161-4968-93d0-aaabce1aa4bb","Type":"ContainerDied","Data":"f0080ba3f01b33a6e53a87494e4ada708db52a9e16f2c01af9cbf29c650c23b1"} Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.308140 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77bd8fcb75-d6pnc" event={"ID":"9502dfee-cb5d-44de-a549-4f0060d29d9b","Type":"ContainerStarted","Data":"54f8aa2f0f1753bd35d785e9b628de9c49c15347ff2b7cb81220daee24e686bd"} Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.308207 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77bd8fcb75-d6pnc" event={"ID":"9502dfee-cb5d-44de-a549-4f0060d29d9b","Type":"ContainerStarted","Data":"c4fb5a25681efd2059498f282458725efd61217c338bf3c0bfddc3ed42df4802"} Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.319734 4759 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" podStartSLOduration=5.319717831 podStartE2EDuration="5.319717831s" podCreationTimestamp="2025-12-05 00:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:38.317583389 +0000 UTC m=+1417.533244339" watchObservedRunningTime="2025-12-05 00:46:38.319717831 +0000 UTC m=+1417.535378781" Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.346110 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-77bd8fcb75-d6pnc" podStartSLOduration=2.96542966 podStartE2EDuration="5.345242048s" podCreationTimestamp="2025-12-05 00:46:33 +0000 UTC" firstStartedPulling="2025-12-05 00:46:34.862181615 +0000 UTC m=+1414.077842565" lastFinishedPulling="2025-12-05 00:46:37.241994013 +0000 UTC m=+1416.457654953" observedRunningTime="2025-12-05 00:46:38.332259289 +0000 UTC m=+1417.547920249" watchObservedRunningTime="2025-12-05 00:46:38.345242048 +0000 UTC m=+1417.560902998" Dec 05 00:46:38 crc kubenswrapper[4759]: I1205 00:46:38.357860 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7d8dd9dc58-8n9k2" podStartSLOduration=2.818049832 podStartE2EDuration="5.357837947s" podCreationTimestamp="2025-12-05 00:46:33 +0000 UTC" firstStartedPulling="2025-12-05 00:46:34.691472974 +0000 UTC m=+1413.907133924" lastFinishedPulling="2025-12-05 00:46:37.231261089 +0000 UTC m=+1416.446922039" observedRunningTime="2025-12-05 00:46:38.356689689 +0000 UTC m=+1417.572350639" watchObservedRunningTime="2025-12-05 00:46:38.357837947 +0000 UTC m=+1417.573498917" Dec 05 00:46:39 crc kubenswrapper[4759]: I1205 00:46:39.319598 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bb856b8d-7psj7" event={"ID":"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455","Type":"ContainerStarted","Data":"164eff8b2fa045e08b0b1e6b260d3643b425a6d8a140149cb8f865601da39249"} Dec 05 00:46:39 crc kubenswrapper[4759]: I1205 00:46:39.322098 4759 generic.go:334] "Generic (PLEG): container finished" podID="8fe2c3db-f452-4009-abca-b9ee975ad38d" containerID="598b58270ecae4c3e21a1f7bb2001bba2b3512ef5d43292a7ed347f7b30bd769" exitCode=0 Dec 05 00:46:39 crc kubenswrapper[4759]: I1205 00:46:39.322930 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wdw54" event={"ID":"8fe2c3db-f452-4009-abca-b9ee975ad38d","Type":"ContainerDied","Data":"598b58270ecae4c3e21a1f7bb2001bba2b3512ef5d43292a7ed347f7b30bd769"} Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.324609 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xbx7b" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.337790 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wdw54" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.376589 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wdw54" event={"ID":"8fe2c3db-f452-4009-abca-b9ee975ad38d","Type":"ContainerDied","Data":"dd1c85f68de91b063c96f79309eb4d90428417c1d5a61e38645affa74083c1be"} Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.376635 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd1c85f68de91b063c96f79309eb4d90428417c1d5a61e38645affa74083c1be" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.376651 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wdw54" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.378776 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xbx7b" event={"ID":"ed73e23b-4161-4968-93d0-aaabce1aa4bb","Type":"ContainerDied","Data":"1cf64d4bc0b68716976321762bf0b25a7e4dc5356d33ea56dd706a6db951abb6"} Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.378797 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cf64d4bc0b68716976321762bf0b25a7e4dc5356d33ea56dd706a6db951abb6" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.378931 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xbx7b" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.410912 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-config-data\") pod \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.410960 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-combined-ca-bundle\") pod \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.411135 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ttwn\" (UniqueName: \"kubernetes.io/projected/ed73e23b-4161-4968-93d0-aaabce1aa4bb-kube-api-access-6ttwn\") pod \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\" (UID: \"ed73e23b-4161-4968-93d0-aaabce1aa4bb\") " Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.416462 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed73e23b-4161-4968-93d0-aaabce1aa4bb-kube-api-access-6ttwn" (OuterVolumeSpecName: "kube-api-access-6ttwn") pod "ed73e23b-4161-4968-93d0-aaabce1aa4bb" (UID: "ed73e23b-4161-4968-93d0-aaabce1aa4bb"). InnerVolumeSpecName "kube-api-access-6ttwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.480179 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed73e23b-4161-4968-93d0-aaabce1aa4bb" (UID: "ed73e23b-4161-4968-93d0-aaabce1aa4bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.494753 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-config-data" (OuterVolumeSpecName: "config-data") pod "ed73e23b-4161-4968-93d0-aaabce1aa4bb" (UID: "ed73e23b-4161-4968-93d0-aaabce1aa4bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.513176 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-scripts\") pod \"8fe2c3db-f452-4009-abca-b9ee975ad38d\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.513241 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7vnd\" (UniqueName: \"kubernetes.io/projected/8fe2c3db-f452-4009-abca-b9ee975ad38d-kube-api-access-r7vnd\") pod \"8fe2c3db-f452-4009-abca-b9ee975ad38d\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.513294 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-config-data\") pod \"8fe2c3db-f452-4009-abca-b9ee975ad38d\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.514245 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-db-sync-config-data\") pod \"8fe2c3db-f452-4009-abca-b9ee975ad38d\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.514275 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fe2c3db-f452-4009-abca-b9ee975ad38d-etc-machine-id\") pod \"8fe2c3db-f452-4009-abca-b9ee975ad38d\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.514390 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-combined-ca-bundle\") pod \"8fe2c3db-f452-4009-abca-b9ee975ad38d\" (UID: \"8fe2c3db-f452-4009-abca-b9ee975ad38d\") " Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.514949 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.514966 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed73e23b-4161-4968-93d0-aaabce1aa4bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.514977 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ttwn\" (UniqueName: \"kubernetes.io/projected/ed73e23b-4161-4968-93d0-aaabce1aa4bb-kube-api-access-6ttwn\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.515411 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/8fe2c3db-f452-4009-abca-b9ee975ad38d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8fe2c3db-f452-4009-abca-b9ee975ad38d" (UID: "8fe2c3db-f452-4009-abca-b9ee975ad38d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.517335 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-scripts" (OuterVolumeSpecName: "scripts") pod "8fe2c3db-f452-4009-abca-b9ee975ad38d" (UID: "8fe2c3db-f452-4009-abca-b9ee975ad38d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.518700 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8fe2c3db-f452-4009-abca-b9ee975ad38d" (UID: "8fe2c3db-f452-4009-abca-b9ee975ad38d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.520520 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fe2c3db-f452-4009-abca-b9ee975ad38d-kube-api-access-r7vnd" (OuterVolumeSpecName: "kube-api-access-r7vnd") pod "8fe2c3db-f452-4009-abca-b9ee975ad38d" (UID: "8fe2c3db-f452-4009-abca-b9ee975ad38d"). InnerVolumeSpecName "kube-api-access-r7vnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.540577 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fe2c3db-f452-4009-abca-b9ee975ad38d" (UID: "8fe2c3db-f452-4009-abca-b9ee975ad38d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.568820 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-config-data" (OuterVolumeSpecName: "config-data") pod "8fe2c3db-f452-4009-abca-b9ee975ad38d" (UID: "8fe2c3db-f452-4009-abca-b9ee975ad38d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.617618 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.617671 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7vnd\" (UniqueName: \"kubernetes.io/projected/8fe2c3db-f452-4009-abca-b9ee975ad38d-kube-api-access-r7vnd\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.617691 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.617710 4759 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.617728 4759 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fe2c3db-f452-4009-abca-b9ee975ad38d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.617749 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe2c3db-f452-4009-abca-b9ee975ad38d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.874646 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.960057 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9wz2q"] Dec 05 00:46:43 crc kubenswrapper[4759]: I1205 00:46:43.960317 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" podUID="283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" containerName="dnsmasq-dns" containerID="cri-o://444351a4492c3a62f57421dbcaccd69251cb813a90111f81776d50d40d0577e7" gracePeriod=10 Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.405945 4759 generic.go:334] "Generic (PLEG): container finished" podID="283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" containerID="444351a4492c3a62f57421dbcaccd69251cb813a90111f81776d50d40d0577e7" exitCode=0 Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.406136 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" event={"ID":"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792","Type":"ContainerDied","Data":"444351a4492c3a62f57421dbcaccd69251cb813a90111f81776d50d40d0577e7"} Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.653395 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 00:46:44 crc kubenswrapper[4759]: E1205 00:46:44.653891 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed73e23b-4161-4968-93d0-aaabce1aa4bb" containerName="heat-db-sync" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.653913 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed73e23b-4161-4968-93d0-aaabce1aa4bb" containerName="heat-db-sync" Dec 05 00:46:44 crc kubenswrapper[4759]: E1205 00:46:44.653937 4759 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe2c3db-f452-4009-abca-b9ee975ad38d" containerName="cinder-db-sync" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.653944 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe2c3db-f452-4009-abca-b9ee975ad38d" containerName="cinder-db-sync" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.654115 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed73e23b-4161-4968-93d0-aaabce1aa4bb" containerName="heat-db-sync" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.654141 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe2c3db-f452-4009-abca-b9ee975ad38d" containerName="cinder-db-sync" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.655210 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.668604 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t9gmd" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.668776 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.668972 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.669268 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.673813 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.755441 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.755505 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ksmt\" (UniqueName: \"kubernetes.io/projected/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-kube-api-access-2ksmt\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.755570 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.755614 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.755636 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-combined-ca-bundle\") 
pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.755675 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.756054 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pq529"] Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.757747 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.858381 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.858441 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.858473 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.859455 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.859488 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-config\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.859553 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.859637 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc 
kubenswrapper[4759]: I1205 00:46:44.859672 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.859716 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ksmt\" (UniqueName: \"kubernetes.io/projected/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-kube-api-access-2ksmt\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.859781 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9dh\" (UniqueName: \"kubernetes.io/projected/50bae1fb-68c4-48a8-a768-d550bc43aa48-kube-api-access-7t9dh\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.859817 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.859855 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.860810 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.875529 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.876484 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.876908 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.877118 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.883050 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pq529"] Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.896809 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ksmt\" (UniqueName: \"kubernetes.io/projected/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-kube-api-access-2ksmt\") pod \"cinder-scheduler-0\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " pod="openstack/cinder-scheduler-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.928025 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.929838 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.935208 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.952606 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.963876 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9dh\" (UniqueName: \"kubernetes.io/projected/50bae1fb-68c4-48a8-a768-d550bc43aa48-kube-api-access-7t9dh\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.963923 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.965955 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.972409 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.972469 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.972488 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-config\") pod 
\"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.972665 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.973514 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.973661 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.974190 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-config\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.975994 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:44 crc kubenswrapper[4759]: I1205 00:46:44.980920 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9dh\" (UniqueName: \"kubernetes.io/projected/50bae1fb-68c4-48a8-a768-d550bc43aa48-kube-api-access-7t9dh\") pod \"dnsmasq-dns-6bb4fc677f-pq529\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.017562 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.076101 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-scripts\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.076171 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/394923f3-7d21-4ec6-b253-d202eb6439bf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.076199 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.076262 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.076355 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394923f3-7d21-4ec6-b253-d202eb6439bf-logs\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.076426 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xxgx\" (UniqueName: \"kubernetes.io/projected/394923f3-7d21-4ec6-b253-d202eb6439bf-kube-api-access-4xxgx\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.076556 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data-custom\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.113062 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.178594 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.178664 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394923f3-7d21-4ec6-b253-d202eb6439bf-logs\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.178720 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xxgx\" (UniqueName: \"kubernetes.io/projected/394923f3-7d21-4ec6-b253-d202eb6439bf-kube-api-access-4xxgx\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.178797 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data-custom\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.178824 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-scripts\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.178858 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/394923f3-7d21-4ec6-b253-d202eb6439bf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.178885 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.179198 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394923f3-7d21-4ec6-b253-d202eb6439bf-logs\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.179426 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/394923f3-7d21-4ec6-b253-d202eb6439bf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.183343 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data-custom\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " 
pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.183649 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.192697 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-scripts\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.193208 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.196821 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xxgx\" (UniqueName: \"kubernetes.io/projected/394923f3-7d21-4ec6-b253-d202eb6439bf-kube-api-access-4xxgx\") pod \"cinder-api-0\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.330197 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.795062 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.889243 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txthv\" (UniqueName: \"kubernetes.io/projected/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-kube-api-access-txthv\") pod \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.889405 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-nb\") pod \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.889490 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-sb\") pod \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.889547 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-swift-storage-0\") pod \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.889590 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-svc\") pod \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") 
" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.889742 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-config\") pod \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\" (UID: \"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792\") " Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.893980 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-kube-api-access-txthv" (OuterVolumeSpecName: "kube-api-access-txthv") pod "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" (UID: "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792"). InnerVolumeSpecName "kube-api-access-txthv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.934812 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" (UID: "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.950244 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" (UID: "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.974324 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" (UID: "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.980601 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-config" (OuterVolumeSpecName: "config") pod "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" (UID: "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.983841 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" (UID: "283e03ad-b9ab-4226-b4fe-5c3fa1dc1792"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.992600 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txthv\" (UniqueName: \"kubernetes.io/projected/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-kube-api-access-txthv\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.992638 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.992647 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.992657 4759 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.992668 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:45 crc kubenswrapper[4759]: I1205 00:46:45.992676 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:46 crc kubenswrapper[4759]: I1205 00:46:46.123776 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:46 crc kubenswrapper[4759]: I1205 00:46:46.309124 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:46:46 crc kubenswrapper[4759]: I1205 00:46:46.428993 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" Dec 05 00:46:46 crc kubenswrapper[4759]: I1205 00:46:46.429096 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-9wz2q" event={"ID":"283e03ad-b9ab-4226-b4fe-5c3fa1dc1792","Type":"ContainerDied","Data":"aa83f15772ea2819891ac11e771fc71e852bd675ad6b6119202766f4c78f3eb9"} Dec 05 00:46:46 crc kubenswrapper[4759]: I1205 00:46:46.429238 4759 scope.go:117] "RemoveContainer" containerID="444351a4492c3a62f57421dbcaccd69251cb813a90111f81776d50d40d0577e7" Dec 05 00:46:46 crc kubenswrapper[4759]: I1205 00:46:46.478496 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9wz2q"] Dec 05 00:46:46 crc kubenswrapper[4759]: I1205 00:46:46.487339 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-9wz2q"] Dec 05 00:46:46 crc kubenswrapper[4759]: I1205 00:46:46.574331 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 00:46:46 crc kubenswrapper[4759]: I1205 00:46:46.838186 4759 scope.go:117] "RemoveContainer" containerID="5f43acf4f73432f27279edfab1df1e9454cdf47e187484e46d03991c661436aa" Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.174695 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" path="/var/lib/kubelet/pods/283e03ad-b9ab-4226-b4fe-5c3fa1dc1792/volumes" Dec 05 00:46:47 crc kubenswrapper[4759]: E1205 00:46:47.215556 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.490287 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.510152 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bb856b8d-7psj7" event={"ID":"52bf4fd7-6aa6-4bdf-b8ac-60c071d42455","Type":"ContainerStarted","Data":"bf5f1ef2c0b36d6a926763f5765a26f05c1321dd057b4b6544e5277ae09c630c"} Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.510928 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.511217 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.558416 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bb856b8d-7psj7" podUID="52bf4fd7-6aa6-4bdf-b8ac-60c071d42455" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.184:9311/healthcheck\": dial tcp 10.217.0.184:9311: connect: connection refused" Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.617542 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-bb856b8d-7psj7" podStartSLOduration=11.617518634 podStartE2EDuration="11.617518634s" podCreationTimestamp="2025-12-05 00:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:47.614436699 +0000 UTC m=+1426.830097639" watchObservedRunningTime="2025-12-05 00:46:47.617518634 
+0000 UTC m=+1426.833179584" Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.649774 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ba26284-ae89-4046-bc8c-acf49206704f","Type":"ContainerStarted","Data":"a502fdb3e6fbb61dbb52f4c2ddbe7afcf99b7fcc8742b2a28917d5cb759d3cf4"} Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.649962 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="ceilometer-notification-agent" containerID="cri-o://c70695da30aaa03429fa5b6dfb1e3bdf0e3be0a4745c0538e1f7b2f35b7181b6" gracePeriod=30 Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.650219 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.650490 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="proxy-httpd" containerID="cri-o://a502fdb3e6fbb61dbb52f4c2ddbe7afcf99b7fcc8742b2a28917d5cb759d3cf4" gracePeriod=30 Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.650535 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="sg-core" containerID="cri-o://0175c3717dc074762ceb841920ecb3659b8e71c7df9ee3746ec3991c29e0a2e2" gracePeriod=30 Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.661041 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.671213 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pq529"] Dec 05 00:46:47 crc kubenswrapper[4759]: I1205 00:46:47.933938 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:48 crc kubenswrapper[4759]: I1205 00:46:48.694193 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"394923f3-7d21-4ec6-b253-d202eb6439bf","Type":"ContainerStarted","Data":"72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29"} Dec 05 00:46:48 crc kubenswrapper[4759]: I1205 00:46:48.695291 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"394923f3-7d21-4ec6-b253-d202eb6439bf","Type":"ContainerStarted","Data":"f522edf8da92017d5c2c277db320ba711cc39adf0112358a6509e2c15782415d"} Dec 05 00:46:48 crc kubenswrapper[4759]: I1205 00:46:48.699092 4759 generic.go:334] "Generic (PLEG): container finished" podID="50bae1fb-68c4-48a8-a768-d550bc43aa48" containerID="36221485da31ece5df6e42705967fb5874d4ba4c9c4a9a04d04312a3f08298c2" exitCode=0 Dec 05 00:46:48 crc kubenswrapper[4759]: I1205 00:46:48.699249 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" event={"ID":"50bae1fb-68c4-48a8-a768-d550bc43aa48","Type":"ContainerDied","Data":"36221485da31ece5df6e42705967fb5874d4ba4c9c4a9a04d04312a3f08298c2"} Dec 05 00:46:48 crc kubenswrapper[4759]: I1205 00:46:48.699926 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" event={"ID":"50bae1fb-68c4-48a8-a768-d550bc43aa48","Type":"ContainerStarted","Data":"2dfb0cbb853645a030b0304edc7f9369ae7c59b94af31dbdd6f9ace85650a8f4"} Dec 05 00:46:48 crc kubenswrapper[4759]: 
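The "Observed pod startup duration" entry above is the kubelet's startup-SLO bookkeeping. Here firstStartedPulling and lastFinishedPulling are the Go zero time (0001-01-01), meaning no image pull was needed, so podStartSLOduration is simply observedRunningTime minus podCreationTimestamp: the barbican-api pod was created at 00:46:36 and observed running at 00:46:47.617..., giving the logged 11.617518634s (the trailing m=+1426.83... is Go's monotonic-clock reading). A minimal sketch that recomputes the figure from the two timestamps; the layout string is an assumption matching Go's default time.Time formatting, which is what these entries appear to print:

```go
package main

import (
	"fmt"
	"time"
)

// Layout matching Go's default time.Time String() output, which is the
// format kubelet prints in these "Observed pod startup duration" entries.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func main() {
	created, err := time.Parse(layout, "2025-12-05 00:46:36 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-12-05 00:46:47.617518634 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// No image pull happened (firstStartedPulling is the zero time), so the
	// SLO duration reduces to observedRunningTime - podCreationTimestamp.
	fmt.Println(observed.Sub(created).Seconds()) // 11.617518634
}
```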
Dec 05 00:46:48 crc kubenswrapper[4759]: I1205 00:46:48.703354 4759 generic.go:334] "Generic (PLEG): container finished" podID="9ba26284-ae89-4046-bc8c-acf49206704f" containerID="a502fdb3e6fbb61dbb52f4c2ddbe7afcf99b7fcc8742b2a28917d5cb759d3cf4" exitCode=0
Dec 05 00:46:48 crc kubenswrapper[4759]: I1205 00:46:48.703428 4759 generic.go:334] "Generic (PLEG): container finished" podID="9ba26284-ae89-4046-bc8c-acf49206704f" containerID="0175c3717dc074762ceb841920ecb3659b8e71c7df9ee3746ec3991c29e0a2e2" exitCode=2
Dec 05 00:46:48 crc kubenswrapper[4759]: I1205 00:46:48.703557 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ba26284-ae89-4046-bc8c-acf49206704f","Type":"ContainerDied","Data":"a502fdb3e6fbb61dbb52f4c2ddbe7afcf99b7fcc8742b2a28917d5cb759d3cf4"}
Dec 05 00:46:48 crc kubenswrapper[4759]: I1205 00:46:48.703629 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ba26284-ae89-4046-bc8c-acf49206704f","Type":"ContainerDied","Data":"0175c3717dc074762ceb841920ecb3659b8e71c7df9ee3746ec3991c29e0a2e2"}
Dec 05 00:46:48 crc kubenswrapper[4759]: I1205 00:46:48.707068 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea","Type":"ContainerStarted","Data":"bfd77554753b5aef900ff3c9561ab9f6ca96d7cac39799d53175e91a840dcbc8"}
Dec 05 00:46:49 crc kubenswrapper[4759]: I1205 00:46:49.733507 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" event={"ID":"50bae1fb-68c4-48a8-a768-d550bc43aa48","Type":"ContainerStarted","Data":"028fec5be9d31a23684828f919f971187ab18b6976eb450551fce05317e6858a"}
Dec 05 00:46:49 crc kubenswrapper[4759]: I1205 00:46:49.734368 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529"
Dec 05 00:46:49 crc kubenswrapper[4759]: I1205 00:46:49.750440 4759 generic.go:334] "Generic (PLEG): container finished" podID="9ba26284-ae89-4046-bc8c-acf49206704f" containerID="c70695da30aaa03429fa5b6dfb1e3bdf0e3be0a4745c0538e1f7b2f35b7181b6" exitCode=0
Dec 05 00:46:49 crc kubenswrapper[4759]: I1205 00:46:49.750536 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ba26284-ae89-4046-bc8c-acf49206704f","Type":"ContainerDied","Data":"c70695da30aaa03429fa5b6dfb1e3bdf0e3be0a4745c0538e1f7b2f35b7181b6"}
Dec 05 00:46:49 crc kubenswrapper[4759]: I1205 00:46:49.754961 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" podStartSLOduration=5.754951802 podStartE2EDuration="5.754951802s" podCreationTimestamp="2025-12-05 00:46:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:49.752656936 +0000 UTC m=+1428.968317886" watchObservedRunningTime="2025-12-05 00:46:49.754951802 +0000 UTC m=+1428.970612742"
Dec 05 00:46:49 crc kubenswrapper[4759]: I1205 00:46:49.756795 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea","Type":"ContainerStarted","Data":"53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21"}
Dec 05 00:46:49 crc kubenswrapper[4759]: I1205 00:46:49.759326 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"394923f3-7d21-4ec6-b253-d202eb6439bf","Type":"ContainerStarted","Data":"18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745"}
Dec 05 00:46:49 crc kubenswrapper[4759]: I1205 00:46:49.759570 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="394923f3-7d21-4ec6-b253-d202eb6439bf" containerName="cinder-api-log" containerID="cri-o://72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29" gracePeriod=30
Dec 05 00:46:49 crc kubenswrapper[4759]: I1205 00:46:49.759619 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="394923f3-7d21-4ec6-b253-d202eb6439bf" containerName="cinder-api" containerID="cri-o://18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745" gracePeriod=30
Dec 05 00:46:49 crc kubenswrapper[4759]: I1205 00:46:49.780517 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.780492445 podStartE2EDuration="5.780492445s" podCreationTimestamp="2025-12-05 00:46:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:49.77617476 +0000 UTC m=+1428.991835710" watchObservedRunningTime="2025-12-05 00:46:49.780492445 +0000 UTC m=+1428.996153395"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.085063 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.208295 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-bb856b8d-7psj7"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.274594 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-config-data\") pod \"9ba26284-ae89-4046-bc8c-acf49206704f\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") "
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.274687 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c26c7\" (UniqueName: \"kubernetes.io/projected/9ba26284-ae89-4046-bc8c-acf49206704f-kube-api-access-c26c7\") pod \"9ba26284-ae89-4046-bc8c-acf49206704f\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") "
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.274730 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-combined-ca-bundle\") pod \"9ba26284-ae89-4046-bc8c-acf49206704f\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") "
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.274770 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-log-httpd\") pod \"9ba26284-ae89-4046-bc8c-acf49206704f\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") "
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.274883 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-run-httpd\") pod \"9ba26284-ae89-4046-bc8c-acf49206704f\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") "
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.274907 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-sg-core-conf-yaml\") pod \"9ba26284-ae89-4046-bc8c-acf49206704f\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") "
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.274937 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-scripts\") pod \"9ba26284-ae89-4046-bc8c-acf49206704f\" (UID: \"9ba26284-ae89-4046-bc8c-acf49206704f\") "
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.276681 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9ba26284-ae89-4046-bc8c-acf49206704f" (UID: "9ba26284-ae89-4046-bc8c-acf49206704f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.276972 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9ba26284-ae89-4046-bc8c-acf49206704f" (UID: "9ba26284-ae89-4046-bc8c-acf49206704f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.283497 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-scripts" (OuterVolumeSpecName: "scripts") pod "9ba26284-ae89-4046-bc8c-acf49206704f" (UID: "9ba26284-ae89-4046-bc8c-acf49206704f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.284644 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba26284-ae89-4046-bc8c-acf49206704f-kube-api-access-c26c7" (OuterVolumeSpecName: "kube-api-access-c26c7") pod "9ba26284-ae89-4046-bc8c-acf49206704f" (UID: "9ba26284-ae89-4046-bc8c-acf49206704f"). InnerVolumeSpecName "kube-api-access-c26c7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.314266 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9ba26284-ae89-4046-bc8c-acf49206704f" (UID: "9ba26284-ae89-4046-bc8c-acf49206704f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.330384 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.354119 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ba26284-ae89-4046-bc8c-acf49206704f" (UID: "9ba26284-ae89-4046-bc8c-acf49206704f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.377407 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c26c7\" (UniqueName: \"kubernetes.io/projected/9ba26284-ae89-4046-bc8c-acf49206704f-kube-api-access-c26c7\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.377663 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.377749 4759 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.378361 4759 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ba26284-ae89-4046-bc8c-acf49206704f-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.378433 4759 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.378603 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.405503 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-config-data" (OuterVolumeSpecName: "config-data") pod "9ba26284-ae89-4046-bc8c-acf49206704f" (UID: "9ba26284-ae89-4046-bc8c-acf49206704f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.481523 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba26284-ae89-4046-bc8c-acf49206704f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.582286 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data-custom\") pod \"394923f3-7d21-4ec6-b253-d202eb6439bf\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.582344 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-combined-ca-bundle\") pod \"394923f3-7d21-4ec6-b253-d202eb6439bf\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.582384 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data\") pod \"394923f3-7d21-4ec6-b253-d202eb6439bf\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.582486 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-scripts\") pod \"394923f3-7d21-4ec6-b253-d202eb6439bf\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.582520 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xxgx\" (UniqueName: \"kubernetes.io/projected/394923f3-7d21-4ec6-b253-d202eb6439bf-kube-api-access-4xxgx\") pod \"394923f3-7d21-4ec6-b253-d202eb6439bf\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.582605 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/394923f3-7d21-4ec6-b253-d202eb6439bf-etc-machine-id\") pod \"394923f3-7d21-4ec6-b253-d202eb6439bf\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.582701 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394923f3-7d21-4ec6-b253-d202eb6439bf-logs\") pod \"394923f3-7d21-4ec6-b253-d202eb6439bf\" (UID: \"394923f3-7d21-4ec6-b253-d202eb6439bf\") " Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.583536 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/394923f3-7d21-4ec6-b253-d202eb6439bf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "394923f3-7d21-4ec6-b253-d202eb6439bf" (UID: "394923f3-7d21-4ec6-b253-d202eb6439bf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.583699 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/394923f3-7d21-4ec6-b253-d202eb6439bf-logs" (OuterVolumeSpecName: "logs") pod "394923f3-7d21-4ec6-b253-d202eb6439bf" (UID: "394923f3-7d21-4ec6-b253-d202eb6439bf"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.585816 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394923f3-7d21-4ec6-b253-d202eb6439bf-logs\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.585852 4759 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/394923f3-7d21-4ec6-b253-d202eb6439bf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.588194 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-scripts" (OuterVolumeSpecName: "scripts") pod "394923f3-7d21-4ec6-b253-d202eb6439bf" (UID: "394923f3-7d21-4ec6-b253-d202eb6439bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.588979 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394923f3-7d21-4ec6-b253-d202eb6439bf-kube-api-access-4xxgx" (OuterVolumeSpecName: "kube-api-access-4xxgx") pod "394923f3-7d21-4ec6-b253-d202eb6439bf" (UID: "394923f3-7d21-4ec6-b253-d202eb6439bf"). InnerVolumeSpecName "kube-api-access-4xxgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.596243 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "394923f3-7d21-4ec6-b253-d202eb6439bf" (UID: "394923f3-7d21-4ec6-b253-d202eb6439bf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.637860 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "394923f3-7d21-4ec6-b253-d202eb6439bf" (UID: "394923f3-7d21-4ec6-b253-d202eb6439bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.728930 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.728967 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.728977 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.728986 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xxgx\" (UniqueName: \"kubernetes.io/projected/394923f3-7d21-4ec6-b253-d202eb6439bf-kube-api-access-4xxgx\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.736867 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data" (OuterVolumeSpecName: "config-data") pod "394923f3-7d21-4ec6-b253-d202eb6439bf" (UID: "394923f3-7d21-4ec6-b253-d202eb6439bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.770222 4759 generic.go:334] "Generic (PLEG): container finished" podID="394923f3-7d21-4ec6-b253-d202eb6439bf" containerID="18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745" exitCode=0 Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.770254 4759 generic.go:334] "Generic (PLEG): container finished" podID="394923f3-7d21-4ec6-b253-d202eb6439bf" containerID="72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29" exitCode=143 Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.770278 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"394923f3-7d21-4ec6-b253-d202eb6439bf","Type":"ContainerDied","Data":"18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745"} Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.770339 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"394923f3-7d21-4ec6-b253-d202eb6439bf","Type":"ContainerDied","Data":"72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29"} Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.770353 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"394923f3-7d21-4ec6-b253-d202eb6439bf","Type":"ContainerDied","Data":"f522edf8da92017d5c2c277db320ba711cc39adf0112358a6509e2c15782415d"} Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.770370 4759 scope.go:117] "RemoveContainer" containerID="18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.770504 4759 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.770504 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.786728 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ba26284-ae89-4046-bc8c-acf49206704f","Type":"ContainerDied","Data":"82cf2cef856fb66be7c38fcee33f05d0ebc7d0fea566212ac94ca43deb2fb9eb"}
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.786850 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.798771 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea","Type":"ContainerStarted","Data":"38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2"}
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.823219 4759 scope.go:117] "RemoveContainer" containerID="72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.836850 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.839286 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394923f3-7d21-4ec6-b253-d202eb6439bf-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.863029 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.865879 4759 scope.go:117] "RemoveContainer" containerID="18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745"
Dec 05 00:46:50 crc kubenswrapper[4759]: E1205 00:46:50.879656 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745\": container with ID starting with 18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745 not found: ID does not exist" containerID="18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.879731 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745"} err="failed to get container status \"18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745\": rpc error: code = NotFound desc = could not find container \"18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745\": container with ID starting with 18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745 not found: ID does not exist"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.879772 4759 scope.go:117] "RemoveContainer" containerID="72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29"
Dec 05 00:46:50 crc kubenswrapper[4759]: E1205 00:46:50.880434 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29\": container with ID starting with 72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29 not found: ID does not exist" containerID="72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.881045 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29"} err="failed to get container status \"72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29\": rpc error: code = NotFound desc = could not find container \"72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29\": container with ID starting with 72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29 not found: ID does not exist"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.881089 4759 scope.go:117] "RemoveContainer" containerID="18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.881540 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745"} err="failed to get container status \"18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745\": rpc error: code = NotFound desc = could not find container \"18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745\": container with ID starting with 18d77ea3312d6fea6a6f4f0dd10ed335137ca977ea0d2ea4814190346756f745 not found: ID does not exist"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.881569 4759 scope.go:117] "RemoveContainer" containerID="72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.881824 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29"} err="failed to get container status \"72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29\": rpc error: code = NotFound desc = could not find container \"72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29\": container with ID starting with 72ebaacf5f73d723a17103183c251e039e964222e2a9c40814df18a0f7f78b29 not found: ID does not exist"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.881852 4759 scope.go:117] "RemoveContainer" containerID="a502fdb3e6fbb61dbb52f4c2ddbe7afcf99b7fcc8742b2a28917d5cb759d3cf4"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.889431 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.889435 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.043540019 podStartE2EDuration="6.889424516s" podCreationTimestamp="2025-12-05 00:46:44 +0000 UTC" firstStartedPulling="2025-12-05 00:46:47.701947847 +0000 UTC m=+1426.917608797" lastFinishedPulling="2025-12-05 00:46:48.547832334 +0000 UTC m=+1427.763493294" observedRunningTime="2025-12-05 00:46:50.847642846 +0000 UTC m=+1430.063303796" watchObservedRunningTime="2025-12-05 00:46:50.889424516 +0000 UTC m=+1430.105085466"
Dec 05 00:46:50 crc kubenswrapper[4759]: E1205 00:46:50.889907 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="ceilometer-notification-agent"
Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.889930 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="ceilometer-notification-agent"
podUID="283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" containerName="init" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.889968 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" containerName="init" Dec 05 00:46:50 crc kubenswrapper[4759]: E1205 00:46:50.889995 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="sg-core" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.890003 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="sg-core" Dec 05 00:46:50 crc kubenswrapper[4759]: E1205 00:46:50.890015 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="proxy-httpd" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.890025 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="proxy-httpd" Dec 05 00:46:50 crc kubenswrapper[4759]: E1205 00:46:50.890042 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394923f3-7d21-4ec6-b253-d202eb6439bf" containerName="cinder-api-log" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.890050 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="394923f3-7d21-4ec6-b253-d202eb6439bf" containerName="cinder-api-log" Dec 05 00:46:50 crc kubenswrapper[4759]: E1205 00:46:50.890070 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" containerName="dnsmasq-dns" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.890078 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" containerName="dnsmasq-dns" Dec 05 00:46:50 crc kubenswrapper[4759]: E1205 00:46:50.890090 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394923f3-7d21-4ec6-b253-d202eb6439bf" containerName="cinder-api" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.890098 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="394923f3-7d21-4ec6-b253-d202eb6439bf" containerName="cinder-api" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.890343 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="ceilometer-notification-agent" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.890372 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="sg-core" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.890392 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="394923f3-7d21-4ec6-b253-d202eb6439bf" containerName="cinder-api-log" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.890406 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="283e03ad-b9ab-4226-b4fe-5c3fa1dc1792" containerName="dnsmasq-dns" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.890415 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" containerName="proxy-httpd" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.890432 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="394923f3-7d21-4ec6-b253-d202eb6439bf" containerName="cinder-api" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.893657 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.898628 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.899073 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.899746 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.939236 4759 scope.go:117] "RemoveContainer" containerID="0175c3717dc074762ceb841920ecb3659b8e71c7df9ee3746ec3991c29e0a2e2" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.945224 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.978837 4759 scope.go:117] "RemoveContainer" containerID="c70695da30aaa03429fa5b6dfb1e3bdf0e3be0a4745c0538e1f7b2f35b7181b6" Dec 05 00:46:50 crc kubenswrapper[4759]: I1205 00:46:50.995163 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.004723 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.023824 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.026718 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.029564 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.029725 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.036266 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.042082 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.042131 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7glzn\" (UniqueName: \"kubernetes.io/projected/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-kube-api-access-7glzn\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.042187 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-config-data-custom\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.042249 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.042436 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-config-data\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.042479 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.042508 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.042556 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-scripts\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.042577 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-logs\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.144876 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-scripts\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.144963 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-logs\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.144994 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145051 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7glzn\" (UniqueName: \"kubernetes.io/projected/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-kube-api-access-7glzn\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145126 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145206 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-config-data-custom\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145235 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145280 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-scripts\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145358 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145424 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-config-data\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145486 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145521 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw7wq\" (UniqueName: \"kubernetes.io/projected/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-kube-api-access-tw7wq\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145538 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145579 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-log-httpd\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " 
pod="openstack/ceilometer-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145618 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-run-httpd\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145661 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-config-data\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145533 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-logs\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.145625 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.151359 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.152084 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.152212 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-config-data-custom\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.152990 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-scripts\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.153295 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.155921 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-config-data\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0" Dec 05 00:46:51 crc 
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.167647 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="394923f3-7d21-4ec6-b253-d202eb6439bf" path="/var/lib/kubelet/pods/394923f3-7d21-4ec6-b253-d202eb6439bf/volumes"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.168455 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba26284-ae89-4046-bc8c-acf49206704f" path="/var/lib/kubelet/pods/9ba26284-ae89-4046-bc8c-acf49206704f/volumes"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.175099 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7glzn\" (UniqueName: \"kubernetes.io/projected/8ba6782e-a35c-4c30-ae5f-5efb85cc001c-kube-api-access-7glzn\") pod \"cinder-api-0\" (UID: \"8ba6782e-a35c-4c30-ae5f-5efb85cc001c\") " pod="openstack/cinder-api-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.231575 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.247718 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.247791 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-scripts\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.247847 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.247910 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw7wq\" (UniqueName: \"kubernetes.io/projected/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-kube-api-access-tw7wq\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.247935 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-log-httpd\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.247962 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-run-httpd\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.247980 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-config-data\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.248917 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-log-httpd\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.249102 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-run-httpd\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.253218 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-scripts\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.260946 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.261530 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-config-data\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.263499 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.267873 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw7wq\" (UniqueName: \"kubernetes.io/projected/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-kube-api-access-tw7wq\") pod \"ceilometer-0\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.377003 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.816495 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 05 00:46:51 crc kubenswrapper[4759]: I1205 00:46:51.936843 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 00:46:52 crc kubenswrapper[4759]: I1205 00:46:52.882849 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c","Type":"ContainerStarted","Data":"51b86621748a9c5173f2c6a1b2a88caf457c0ec920c8e6a7a2810d02eccc2ced"}
Dec 05 00:46:52 crc kubenswrapper[4759]: I1205 00:46:52.892776 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8ba6782e-a35c-4c30-ae5f-5efb85cc001c","Type":"ContainerStarted","Data":"1edbfa5cd8558b0eaba3b0116521272665bfb5a6e072753f3b6286935691c3c2"}
Dec 05 00:46:52 crc kubenswrapper[4759]: I1205 00:46:52.892844 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8ba6782e-a35c-4c30-ae5f-5efb85cc001c","Type":"ContainerStarted","Data":"c77395910f88be262151d7393d534a8f646684eaed0cfafcfad25f4bcdbf9a21"}
Dec 05 00:46:53 crc kubenswrapper[4759]: I1205 00:46:53.028059 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7677b6f8d5-zwkn7"
Dec 05 00:46:53 crc kubenswrapper[4759]: I1205 00:46:53.137281 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bd7d6d9bd-hpq49"]
Dec 05 00:46:53 crc kubenswrapper[4759]: I1205 00:46:53.144613 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bd7d6d9bd-hpq49" podUID="75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" containerName="neutron-api" containerID="cri-o://cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3" gracePeriod=30
Dec 05 00:46:53 crc kubenswrapper[4759]: I1205 00:46:53.145273 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bd7d6d9bd-hpq49" podUID="75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" containerName="neutron-httpd" containerID="cri-o://83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30" gracePeriod=30
Dec 05 00:46:53 crc kubenswrapper[4759]: I1205 00:46:53.904747 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c","Type":"ContainerStarted","Data":"72737862ad4a55821e89a60dda0106661dc8f94b543fca01302e7b62aee0db14"}
Dec 05 00:46:53 crc kubenswrapper[4759]: I1205 00:46:53.905115 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c","Type":"ContainerStarted","Data":"7c4ae836e5a36251b9a4ecfdc0b6b4741d86453eb944cbd1e5c74bbb8775d24e"}
Dec 05 00:46:53 crc kubenswrapper[4759]: I1205 00:46:53.907714 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8ba6782e-a35c-4c30-ae5f-5efb85cc001c","Type":"ContainerStarted","Data":"d903d665b599f6fe315b2f57f55d234e996538db66dec530be4381aed991fbe7"}
Dec 05 00:46:53 crc kubenswrapper[4759]: I1205 00:46:53.907809 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 05 00:46:53 crc kubenswrapper[4759]: I1205 00:46:53.911067 4759 generic.go:334] "Generic (PLEG): container finished" podID="75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" containerID="83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30" exitCode=0
Dec 05 00:46:53 crc kubenswrapper[4759]: I1205 00:46:53.911105 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd7d6d9bd-hpq49" event={"ID":"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372","Type":"ContainerDied","Data":"83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30"}
Dec 05 00:46:53 crc kubenswrapper[4759]: I1205 00:46:53.930201 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.930163033 podStartE2EDuration="3.930163033s" podCreationTimestamp="2025-12-05 00:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:46:53.927964059 +0000 UTC m=+1433.143625029" watchObservedRunningTime="2025-12-05 00:46:53.930163033 +0000 UTC m=+1433.145823993"
Dec 05 00:46:54 crc kubenswrapper[4759]: I1205 00:46:54.928079 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd7d6d9bd-hpq49"
Dec 05 00:46:54 crc kubenswrapper[4759]: I1205 00:46:54.928407 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c","Type":"ContainerStarted","Data":"77599f234268d4d8e87fbe877fa628451eeaeec3f0eeab413d118551fdd81ccb"}
Dec 05 00:46:54 crc kubenswrapper[4759]: I1205 00:46:54.930873 4759 generic.go:334] "Generic (PLEG): container finished" podID="75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" containerID="cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3" exitCode=0
Dec 05 00:46:54 crc kubenswrapper[4759]: I1205 00:46:54.930985 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd7d6d9bd-hpq49" event={"ID":"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372","Type":"ContainerDied","Data":"cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3"}
Dec 05 00:46:54 crc kubenswrapper[4759]: I1205 00:46:54.931064 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd7d6d9bd-hpq49" event={"ID":"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372","Type":"ContainerDied","Data":"abf84cec919fc214c54244eafdeb985c7811374d42bfdb566351ea889e690848"}
Dec 05 00:46:54 crc kubenswrapper[4759]: I1205 00:46:54.931106 4759 scope.go:117] "RemoveContainer" containerID="83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30"
Dec 05 00:46:54 crc kubenswrapper[4759]: I1205 00:46:54.982147 4759 scope.go:117] "RemoveContainer" containerID="cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3"
Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.009410 4759 scope.go:117] "RemoveContainer" containerID="83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30"
Dec 05 00:46:55 crc kubenswrapper[4759]: E1205 00:46:55.009996 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30\": container with ID starting with 83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30 not found: ID does not exist" containerID="83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30"
Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.010046 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30"} err="failed
to get container status \"83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30\": rpc error: code = NotFound desc = could not find container \"83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30\": container with ID starting with 83ee476b88bc5bdfa5a14d982c76ad4913202b8302705b50457cf31b195c1e30 not found: ID does not exist" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.010084 4759 scope.go:117] "RemoveContainer" containerID="cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3" Dec 05 00:46:55 crc kubenswrapper[4759]: E1205 00:46:55.011023 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3\": container with ID starting with cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3 not found: ID does not exist" containerID="cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.011057 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3"} err="failed to get container status \"cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3\": rpc error: code = NotFound desc = could not find container \"cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3\": container with ID starting with cd760ff4f59926141c8e278075e0e3837283f12de8aa2ca687848bf5a35899b3 not found: ID does not exist" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.019216 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.033142 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-combined-ca-bundle\") pod \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.033266 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-httpd-config\") pod \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.033292 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-config\") pod \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.033451 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-ovndb-tls-certs\") pod \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.033524 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-255lk\" (UniqueName: \"kubernetes.io/projected/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-kube-api-access-255lk\") pod \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\" (UID: \"75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372\") " Dec 05 00:46:55 crc 
kubenswrapper[4759]: I1205 00:46:55.053370 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-kube-api-access-255lk" (OuterVolumeSpecName: "kube-api-access-255lk") pod "75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" (UID: "75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372"). InnerVolumeSpecName "kube-api-access-255lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.059623 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" (UID: "75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.094965 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" (UID: "75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.114638 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.123607 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" (UID: "75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.125278 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-config" (OuterVolumeSpecName: "config") pod "75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" (UID: "75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.147187 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.147217 4759 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.147228 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.147236 4759 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.147244 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-255lk\" (UniqueName: \"kubernetes.io/projected/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372-kube-api-access-255lk\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.211505 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4fzp8"] Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.211778 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" podUID="cfb699fb-b45e-408a-a77c-91f48c5b9e08" containerName="dnsmasq-dns" containerID="cri-o://af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381" gracePeriod=10 Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.264861 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.696176 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.861996 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-swift-storage-0\") pod \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.862155 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpc2g\" (UniqueName: \"kubernetes.io/projected/cfb699fb-b45e-408a-a77c-91f48c5b9e08-kube-api-access-hpc2g\") pod \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.862203 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-svc\") pod \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.862239 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-config\") pod \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.862441 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-sb\") pod \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.862583 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-nb\") pod \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\" (UID: \"cfb699fb-b45e-408a-a77c-91f48c5b9e08\") " Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.872293 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb699fb-b45e-408a-a77c-91f48c5b9e08-kube-api-access-hpc2g" (OuterVolumeSpecName: "kube-api-access-hpc2g") pod "cfb699fb-b45e-408a-a77c-91f48c5b9e08" (UID: "cfb699fb-b45e-408a-a77c-91f48c5b9e08"). InnerVolumeSpecName "kube-api-access-hpc2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.917885 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-config" (OuterVolumeSpecName: "config") pod "cfb699fb-b45e-408a-a77c-91f48c5b9e08" (UID: "cfb699fb-b45e-408a-a77c-91f48c5b9e08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.922084 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cfb699fb-b45e-408a-a77c-91f48c5b9e08" (UID: "cfb699fb-b45e-408a-a77c-91f48c5b9e08"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.941809 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cfb699fb-b45e-408a-a77c-91f48c5b9e08" (UID: "cfb699fb-b45e-408a-a77c-91f48c5b9e08"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.959896 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cfb699fb-b45e-408a-a77c-91f48c5b9e08" (UID: "cfb699fb-b45e-408a-a77c-91f48c5b9e08"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.964749 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpc2g\" (UniqueName: \"kubernetes.io/projected/cfb699fb-b45e-408a-a77c-91f48c5b9e08-kube-api-access-hpc2g\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.964781 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.964791 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.964802 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.964811 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.988552 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd7d6d9bd-hpq49" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.998616 4759 generic.go:334] "Generic (PLEG): container finished" podID="cfb699fb-b45e-408a-a77c-91f48c5b9e08" containerID="af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381" exitCode=0 Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.998671 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.998688 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" event={"ID":"cfb699fb-b45e-408a-a77c-91f48c5b9e08","Type":"ContainerDied","Data":"af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381"} Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.999652 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-4fzp8" event={"ID":"cfb699fb-b45e-408a-a77c-91f48c5b9e08","Type":"ContainerDied","Data":"312d61be45e0b91bc5689e18f03be8437a3cbfb949daa74494662ea6eb95e24e"} Dec 05 00:46:55 crc kubenswrapper[4759]: I1205 00:46:55.999763 4759 scope.go:117] "RemoveContainer" containerID="af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381" Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.026952 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bd7d6d9bd-hpq49"] Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.027010 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c","Type":"ContainerStarted","Data":"6ef9020881c3b4cfd947de38180d29239144d6a23f1cf11d71ca87dc9decd683"} Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.027668 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.037694 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cfb699fb-b45e-408a-a77c-91f48c5b9e08" (UID: "cfb699fb-b45e-408a-a77c-91f48c5b9e08"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.044890 4759 scope.go:117] "RemoveContainer" containerID="6974db3af9cab0e6d2b8359b751f03f72125fc284a9a509cebe32de391e0db4f" Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.060883 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6bd7d6d9bd-hpq49"] Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.066724 4759 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfb699fb-b45e-408a-a77c-91f48c5b9e08-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.077184 4759 scope.go:117] "RemoveContainer" containerID="af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381" Dec 05 00:46:56 crc kubenswrapper[4759]: E1205 00:46:56.078222 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381\": container with ID starting with af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381 not found: ID does not exist" containerID="af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381" Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.078272 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381"} err="failed to get container status \"af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381\": rpc error: code = NotFound desc = could not find container \"af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381\": container with ID starting with af02233544209b1fafeb02fc9583be4bcddcab5afe6fc054900010b7b5cdb381 not found: ID does not exist" Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.078316 4759 scope.go:117] "RemoveContainer" containerID="6974db3af9cab0e6d2b8359b751f03f72125fc284a9a509cebe32de391e0db4f" Dec 05 00:46:56 crc kubenswrapper[4759]: E1205 00:46:56.083228 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6974db3af9cab0e6d2b8359b751f03f72125fc284a9a509cebe32de391e0db4f\": container with ID starting with 6974db3af9cab0e6d2b8359b751f03f72125fc284a9a509cebe32de391e0db4f not found: ID does not exist" containerID="6974db3af9cab0e6d2b8359b751f03f72125fc284a9a509cebe32de391e0db4f" Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.083263 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6974db3af9cab0e6d2b8359b751f03f72125fc284a9a509cebe32de391e0db4f"} err="failed to get container status \"6974db3af9cab0e6d2b8359b751f03f72125fc284a9a509cebe32de391e0db4f\": rpc error: code = NotFound desc = could not find container \"6974db3af9cab0e6d2b8359b751f03f72125fc284a9a509cebe32de391e0db4f\": container with ID starting with 6974db3af9cab0e6d2b8359b751f03f72125fc284a9a509cebe32de391e0db4f not found: ID does not exist" Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.088092 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.491020679 podStartE2EDuration="6.08807433s" podCreationTimestamp="2025-12-05 00:46:50 +0000 UTC" firstStartedPulling="2025-12-05 00:46:51.946336287 +0000 UTC m=+1431.161997237" lastFinishedPulling="2025-12-05 
00:46:55.543389918 +0000 UTC m=+1434.759050888" observedRunningTime="2025-12-05 00:46:56.071972857 +0000 UTC m=+1435.287633807" watchObservedRunningTime="2025-12-05 00:46:56.08807433 +0000 UTC m=+1435.303735280" Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.089373 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.412892 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4fzp8"] Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.423294 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4fzp8"] Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.836456 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-bb856b8d-7psj7" Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.948329 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b7dd4f9bb-wh6p9"] Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.950007 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerName="barbican-api-log" containerID="cri-o://076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff" gracePeriod=30 Dec 05 00:46:56 crc kubenswrapper[4759]: I1205 00:46:56.950421 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerName="barbican-api" containerID="cri-o://88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c" gracePeriod=30 Dec 05 00:46:57 crc kubenswrapper[4759]: I1205 00:46:57.049329 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" containerName="cinder-scheduler" containerID="cri-o://53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21" gracePeriod=30 Dec 05 00:46:57 crc kubenswrapper[4759]: I1205 00:46:57.049943 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" containerName="probe" containerID="cri-o://38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2" gracePeriod=30 Dec 05 00:46:57 crc kubenswrapper[4759]: I1205 00:46:57.167791 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" path="/var/lib/kubelet/pods/75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372/volumes" Dec 05 00:46:57 crc kubenswrapper[4759]: I1205 00:46:57.168538 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb699fb-b45e-408a-a77c-91f48c5b9e08" path="/var/lib/kubelet/pods/cfb699fb-b45e-408a-a77c-91f48c5b9e08/volumes" Dec 05 00:46:58 crc kubenswrapper[4759]: I1205 00:46:58.064864 4759 generic.go:334] "Generic (PLEG): container finished" podID="3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" containerID="38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2" exitCode=0 Dec 05 00:46:58 crc kubenswrapper[4759]: I1205 00:46:58.064941 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea","Type":"ContainerDied","Data":"38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2"} Dec 05 00:46:58 crc 
kubenswrapper[4759]: I1205 00:46:58.070574 4759 generic.go:334] "Generic (PLEG): container finished" podID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerID="076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff" exitCode=143 Dec 05 00:46:58 crc kubenswrapper[4759]: I1205 00:46:58.070609 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" event={"ID":"2b80deca-451b-446c-88b9-42c4521b4cc8","Type":"ContainerDied","Data":"076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff"} Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.123396 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.183:9311/healthcheck\": read tcp 10.217.0.2:57400->10.217.0.183:9311: read: connection reset by peer" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.124324 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.183:9311/healthcheck\": read tcp 10.217.0.2:57416->10.217.0.183:9311: read: connection reset by peer" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.619645 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.788051 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-combined-ca-bundle\") pod \"2b80deca-451b-446c-88b9-42c4521b4cc8\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.788275 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data\") pod \"2b80deca-451b-446c-88b9-42c4521b4cc8\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.788452 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b80deca-451b-446c-88b9-42c4521b4cc8-logs\") pod \"2b80deca-451b-446c-88b9-42c4521b4cc8\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.788549 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6lv6\" (UniqueName: \"kubernetes.io/projected/2b80deca-451b-446c-88b9-42c4521b4cc8-kube-api-access-r6lv6\") pod \"2b80deca-451b-446c-88b9-42c4521b4cc8\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.788576 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data-custom\") pod \"2b80deca-451b-446c-88b9-42c4521b4cc8\" (UID: \"2b80deca-451b-446c-88b9-42c4521b4cc8\") " Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.789662 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b80deca-451b-446c-88b9-42c4521b4cc8-logs" (OuterVolumeSpecName: "logs") pod 
"2b80deca-451b-446c-88b9-42c4521b4cc8" (UID: "2b80deca-451b-446c-88b9-42c4521b4cc8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.812577 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2b80deca-451b-446c-88b9-42c4521b4cc8" (UID: "2b80deca-451b-446c-88b9-42c4521b4cc8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.813740 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b80deca-451b-446c-88b9-42c4521b4cc8-kube-api-access-r6lv6" (OuterVolumeSpecName: "kube-api-access-r6lv6") pod "2b80deca-451b-446c-88b9-42c4521b4cc8" (UID: "2b80deca-451b-446c-88b9-42c4521b4cc8"). InnerVolumeSpecName "kube-api-access-r6lv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.837354 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b80deca-451b-446c-88b9-42c4521b4cc8" (UID: "2b80deca-451b-446c-88b9-42c4521b4cc8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.879366 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data" (OuterVolumeSpecName: "config-data") pod "2b80deca-451b-446c-88b9-42c4521b4cc8" (UID: "2b80deca-451b-446c-88b9-42c4521b4cc8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.892444 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b80deca-451b-446c-88b9-42c4521b4cc8-logs\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.892477 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6lv6\" (UniqueName: \"kubernetes.io/projected/2b80deca-451b-446c-88b9-42c4521b4cc8-kube-api-access-r6lv6\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.892496 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.892509 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.892520 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b80deca-451b-446c-88b9-42c4521b4cc8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:00 crc kubenswrapper[4759]: I1205 00:47:00.927023 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.095966 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data-custom\") pod \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.096005 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-scripts\") pod \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.096080 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ksmt\" (UniqueName: \"kubernetes.io/projected/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-kube-api-access-2ksmt\") pod \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.096157 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data\") pod \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.096205 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-etc-machine-id\") pod \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.096355 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" (UID: "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.096576 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-combined-ca-bundle\") pod \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\" (UID: \"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea\") " Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.097067 4759 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.100792 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-kube-api-access-2ksmt" (OuterVolumeSpecName: "kube-api-access-2ksmt") pod "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" (UID: "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea"). InnerVolumeSpecName "kube-api-access-2ksmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.100801 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" (UID: "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.104159 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-scripts" (OuterVolumeSpecName: "scripts") pod "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" (UID: "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.114928 4759 generic.go:334] "Generic (PLEG): container finished" podID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerID="88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c" exitCode=0 Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.114996 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" event={"ID":"2b80deca-451b-446c-88b9-42c4521b4cc8","Type":"ContainerDied","Data":"88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c"} Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.115032 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" event={"ID":"2b80deca-451b-446c-88b9-42c4521b4cc8","Type":"ContainerDied","Data":"2a480e2be8592da85048d17f7fbce681eaccfaa361744ec82f3f9225ae60acd3"} Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.115053 4759 scope.go:117] "RemoveContainer" containerID="88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.115178 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.118862 4759 generic.go:334] "Generic (PLEG): container finished" podID="3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" containerID="53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21" exitCode=0 Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.118916 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea","Type":"ContainerDied","Data":"53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21"} Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.118956 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d6e8fd3-623c-4cd8-b701-f166efd0a5ea","Type":"ContainerDied","Data":"bfd77554753b5aef900ff3c9561ab9f6ca96d7cac39799d53175e91a840dcbc8"} Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.119002 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.199328 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" (UID: "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.199734 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.199760 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.199781 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.199800 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ksmt\" (UniqueName: \"kubernetes.io/projected/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-kube-api-access-2ksmt\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.267765 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data" (OuterVolumeSpecName: "config-data") pod "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" (UID: "3d6e8fd3-623c-4cd8-b701-f166efd0a5ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.296688 4759 scope.go:117] "RemoveContainer" containerID="076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.301806 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.318266 4759 scope.go:117] "RemoveContainer" containerID="88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c" Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 00:47:01.318810 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c\": container with ID starting with 88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c not found: ID does not exist" containerID="88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.318861 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c"} err="failed to get container status \"88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c\": rpc error: code = NotFound desc = could not find container \"88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c\": container with ID starting with 88eaba7d13a33cd5d95d18427d9bbcee3b24aa2dae638bd8f4064452f08c7d7c not found: ID does not exist" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.318907 4759 scope.go:117] "RemoveContainer" containerID="076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff" Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 00:47:01.319214 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff\": container with ID starting with 076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff not found: ID does not exist" containerID="076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.319257 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff"} err="failed to get container status \"076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff\": rpc error: code = NotFound desc = could not find container \"076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff\": container with ID starting with 076e7a989ad48106e10d8a82e55b7eabd6713382c7d25604dd4fc321dc1428ff not found: ID does not exist" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.319288 4759 scope.go:117] "RemoveContainer" containerID="38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.343210 4759 scope.go:117] "RemoveContainer" containerID="53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.371355 4759 scope.go:117] "RemoveContainer" containerID="38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2" Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 
00:47:01.371893 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2\": container with ID starting with 38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2 not found: ID does not exist" containerID="38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.371940 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2"} err="failed to get container status \"38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2\": rpc error: code = NotFound desc = could not find container \"38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2\": container with ID starting with 38e2c0b867657ef7a98392b1a667b31732ec0de9f8049694038f02350b4361d2 not found: ID does not exist" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.371971 4759 scope.go:117] "RemoveContainer" containerID="53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21" Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 00:47:01.372327 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21\": container with ID starting with 53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21 not found: ID does not exist" containerID="53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.372377 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21"} err="failed to get container status \"53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21\": rpc error: code = NotFound desc = could not find container \"53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21\": container with ID starting with 53561b73ffe65a1448d4943520f42049818077961ddf82515d09e220adf83b21 not found: ID does not exist" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.461066 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.481132 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.495346 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 00:47:01.495806 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" containerName="probe" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.495828 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" containerName="probe" Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 00:47:01.495854 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb699fb-b45e-408a-a77c-91f48c5b9e08" containerName="init" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.495863 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb699fb-b45e-408a-a77c-91f48c5b9e08" containerName="init" Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 00:47:01.495875 4759 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerName="barbican-api-log" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.495884 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerName="barbican-api-log" Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 00:47:01.495897 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb699fb-b45e-408a-a77c-91f48c5b9e08" containerName="dnsmasq-dns" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.495905 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb699fb-b45e-408a-a77c-91f48c5b9e08" containerName="dnsmasq-dns" Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 00:47:01.495932 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" containerName="neutron-api" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.495939 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" containerName="neutron-api" Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 00:47:01.495954 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerName="barbican-api" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.495962 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerName="barbican-api" Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 00:47:01.495980 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" containerName="cinder-scheduler" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.495987 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" containerName="cinder-scheduler" Dec 05 00:47:01 crc kubenswrapper[4759]: E1205 00:47:01.496007 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" containerName="neutron-httpd" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.496015 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" containerName="neutron-httpd" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.496239 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb699fb-b45e-408a-a77c-91f48c5b9e08" containerName="dnsmasq-dns" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.496261 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" containerName="probe" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.496283 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerName="barbican-api" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.496302 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" containerName="barbican-api-log" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.496340 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" containerName="neutron-api" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.496356 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ccfa87-d7ed-4bdd-a93d-a8fdd24f0372" containerName="neutron-httpd" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.496374 4759 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" containerName="cinder-scheduler" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.497570 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.499717 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.507937 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.610112 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfzh9\" (UniqueName: \"kubernetes.io/projected/84365c40-0d24-43ab-b5d1-66c9531bb860-kube-api-access-zfzh9\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.610174 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.610255 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.610517 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-config-data\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.610798 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84365c40-0d24-43ab-b5d1-66c9531bb860-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.610934 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-scripts\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.712936 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.713020 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-config-data\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.713139 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84365c40-0d24-43ab-b5d1-66c9531bb860-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.713209 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-scripts\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.713387 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfzh9\" (UniqueName: \"kubernetes.io/projected/84365c40-0d24-43ab-b5d1-66c9531bb860-kube-api-access-zfzh9\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.713422 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.713667 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84365c40-0d24-43ab-b5d1-66c9531bb860-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.719902 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.723042 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-config-data\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.724922 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-scripts\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.725815 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84365c40-0d24-43ab-b5d1-66c9531bb860-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.745657 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zfzh9\" (UniqueName: \"kubernetes.io/projected/84365c40-0d24-43ab-b5d1-66c9531bb860-kube-api-access-zfzh9\") pod \"cinder-scheduler-0\" (UID: \"84365c40-0d24-43ab-b5d1-66c9531bb860\") " pod="openstack/cinder-scheduler-0" Dec 05 00:47:01 crc kubenswrapper[4759]: I1205 00:47:01.819534 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 00:47:02 crc kubenswrapper[4759]: I1205 00:47:02.397182 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 00:47:02 crc kubenswrapper[4759]: W1205 00:47:02.400606 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84365c40_0d24_43ab_b5d1_66c9531bb860.slice/crio-49df2c66d98b7bfe2eabb6801f515e2f584ba6904c8ecdc1cdcd34a3fcb25b53 WatchSource:0}: Error finding container 49df2c66d98b7bfe2eabb6801f515e2f584ba6904c8ecdc1cdcd34a3fcb25b53: Status 404 returned error can't find the container with id 49df2c66d98b7bfe2eabb6801f515e2f584ba6904c8ecdc1cdcd34a3fcb25b53 Dec 05 00:47:02 crc kubenswrapper[4759]: I1205 00:47:02.944538 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 05 00:47:03 crc kubenswrapper[4759]: I1205 00:47:03.142119 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"84365c40-0d24-43ab-b5d1-66c9531bb860","Type":"ContainerStarted","Data":"9557679e44fde371368a902306be5dd66a05b0e2c991aacfaaacc186bb52a6bf"} Dec 05 00:47:03 crc kubenswrapper[4759]: I1205 00:47:03.142450 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"84365c40-0d24-43ab-b5d1-66c9531bb860","Type":"ContainerStarted","Data":"49df2c66d98b7bfe2eabb6801f515e2f584ba6904c8ecdc1cdcd34a3fcb25b53"} Dec 05 00:47:03 crc kubenswrapper[4759]: I1205 00:47:03.176420 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6e8fd3-623c-4cd8-b701-f166efd0a5ea" path="/var/lib/kubelet/pods/3d6e8fd3-623c-4cd8-b701-f166efd0a5ea/volumes" Dec 05 00:47:04 crc kubenswrapper[4759]: I1205 00:47:04.156272 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"84365c40-0d24-43ab-b5d1-66c9531bb860","Type":"ContainerStarted","Data":"1a91ec32a12c6b2aa437b1c3608b9f4f70e81a76d9e6bee5921334efeff5515b"} Dec 05 00:47:04 crc kubenswrapper[4759]: I1205 00:47:04.187626 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.187607705 podStartE2EDuration="3.187607705s" podCreationTimestamp="2025-12-05 00:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:04.184590852 +0000 UTC m=+1443.400251802" watchObservedRunningTime="2025-12-05 00:47:04.187607705 +0000 UTC m=+1443.403268655" Dec 05 00:47:04 crc kubenswrapper[4759]: I1205 00:47:04.433870 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:47:04 crc kubenswrapper[4759]: I1205 00:47:04.433958 4759 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:47:04 crc kubenswrapper[4759]: I1205 00:47:04.510547 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:47:04 crc kubenswrapper[4759]: I1205 00:47:04.554357 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bb6fdf748-gpknz" Dec 05 00:47:05 crc kubenswrapper[4759]: I1205 00:47:05.400756 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-b7c787945-xkhnb" Dec 05 00:47:06 crc kubenswrapper[4759]: I1205 00:47:06.820139 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.106855 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-54dff77467-gvp7s"] Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.108497 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.120045 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.121558 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.122407 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.122601 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.129733 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-9jwlj" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.129761 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.130002 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.130156 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-hvvg5" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.134068 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.150387 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-54dff77467-gvp7s"] Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.250856 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5f7cf55b87-zh4pm"] Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.261726 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.265968 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.283054 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5f7cf55b87-zh4pm"] Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.307810 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8effa6a4-dc68-4020-bd47-c83bcdc8d337-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.307905 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.307930 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xsp\" (UniqueName: \"kubernetes.io/projected/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-kube-api-access-s8xsp\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.307961 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69l57\" (UniqueName: \"kubernetes.io/projected/8effa6a4-dc68-4020-bd47-c83bcdc8d337-kube-api-access-69l57\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.308033 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8effa6a4-dc68-4020-bd47-c83bcdc8d337-openstack-config-secret\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.308081 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-combined-ca-bundle\") pod \"heat-engine-54dff77467-gvp7s\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.308257 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data-custom\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.308290 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data-custom\") pod \"heat-engine-54dff77467-gvp7s\" (UID: 
\"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.308346 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8effa6a4-dc68-4020-bd47-c83bcdc8d337-openstack-config\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.308420 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-combined-ca-bundle\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.308444 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data\") pod \"heat-engine-54dff77467-gvp7s\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.308560 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxx2c\" (UniqueName: \"kubernetes.io/projected/3fa7030d-e94e-4590-900a-8d5d41398af8-kube-api-access-mxx2c\") pod \"heat-engine-54dff77467-gvp7s\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.336427 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-pzvb2"] Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.352824 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-pzvb2"] Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.352858 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-58c77cfcc9-n2qk9"] Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.353703 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.354209 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.356833 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.367176 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58c77cfcc9-n2qk9"] Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412238 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-svc\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412282 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxx2c\" (UniqueName: \"kubernetes.io/projected/3fa7030d-e94e-4590-900a-8d5d41398af8-kube-api-access-mxx2c\") pod \"heat-engine-54dff77467-gvp7s\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412366 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-config\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412394 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr9ht\" (UniqueName: \"kubernetes.io/projected/b91072a1-a45b-4c34-9093-485cd9431151-kube-api-access-rr9ht\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412426 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8effa6a4-dc68-4020-bd47-c83bcdc8d337-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412441 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-swift-storage-0\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412472 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412486 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xsp\" (UniqueName: \"kubernetes.io/projected/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-kube-api-access-s8xsp\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " 
pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412516 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69l57\" (UniqueName: \"kubernetes.io/projected/8effa6a4-dc68-4020-bd47-c83bcdc8d337-kube-api-access-69l57\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412540 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8effa6a4-dc68-4020-bd47-c83bcdc8d337-openstack-config-secret\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412568 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-combined-ca-bundle\") pod \"heat-engine-54dff77467-gvp7s\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412587 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412623 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412645 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data-custom\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412681 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data-custom\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412728 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data-custom\") pod \"heat-engine-54dff77467-gvp7s\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412750 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8effa6a4-dc68-4020-bd47-c83bcdc8d337-openstack-config\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " 
pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412779 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p9j7\" (UniqueName: \"kubernetes.io/projected/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-kube-api-access-4p9j7\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412797 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-combined-ca-bundle\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412815 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data\") pod \"heat-engine-54dff77467-gvp7s\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412832 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.412856 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-combined-ca-bundle\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.418410 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8effa6a4-dc68-4020-bd47-c83bcdc8d337-openstack-config\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.421455 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data\") pod \"heat-engine-54dff77467-gvp7s\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.425693 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8effa6a4-dc68-4020-bd47-c83bcdc8d337-openstack-config-secret\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.427432 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data-custom\") pod \"heat-engine-54dff77467-gvp7s\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc 
kubenswrapper[4759]: I1205 00:47:09.429072 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data-custom\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.433883 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8effa6a4-dc68-4020-bd47-c83bcdc8d337-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.434062 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-combined-ca-bundle\") pod \"heat-engine-54dff77467-gvp7s\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.434240 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxx2c\" (UniqueName: \"kubernetes.io/projected/3fa7030d-e94e-4590-900a-8d5d41398af8-kube-api-access-mxx2c\") pod \"heat-engine-54dff77467-gvp7s\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.435171 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.436550 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-combined-ca-bundle\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.437058 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xsp\" (UniqueName: \"kubernetes.io/projected/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-kube-api-access-s8xsp\") pod \"heat-cfnapi-5f7cf55b87-zh4pm\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.437605 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.439138 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69l57\" (UniqueName: \"kubernetes.io/projected/8effa6a4-dc68-4020-bd47-c83bcdc8d337-kube-api-access-69l57\") pod \"openstackclient\" (UID: \"8effa6a4-dc68-4020-bd47-c83bcdc8d337\") " pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.447941 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.515042 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.515082 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data-custom\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.515147 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p9j7\" (UniqueName: \"kubernetes.io/projected/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-kube-api-access-4p9j7\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.515168 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.515188 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-combined-ca-bundle\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.515220 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-svc\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.515259 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-config\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.515283 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr9ht\" (UniqueName: \"kubernetes.io/projected/b91072a1-a45b-4c34-9093-485cd9431151-kube-api-access-rr9ht\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.515326 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-swift-storage-0\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 
00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.515380 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.516513 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.517291 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-config\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.517386 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.518090 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-swift-storage-0\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.518177 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-svc\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.521608 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data-custom\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.521750 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-combined-ca-bundle\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.533432 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.535348 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p9j7\" 
(UniqueName: \"kubernetes.io/projected/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-kube-api-access-4p9j7\") pod \"dnsmasq-dns-7d978555f9-pzvb2\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.535349 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr9ht\" (UniqueName: \"kubernetes.io/projected/b91072a1-a45b-4c34-9093-485cd9431151-kube-api-access-rr9ht\") pod \"heat-api-58c77cfcc9-n2qk9\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.628724 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.684499 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.694982 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:09 crc kubenswrapper[4759]: I1205 00:47:09.991576 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-54dff77467-gvp7s"] Dec 05 00:47:10 crc kubenswrapper[4759]: W1205 00:47:10.086907 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8effa6a4_dc68_4020_bd47_c83bcdc8d337.slice/crio-761d64e9d1050c0142e5fdaecb6dfa016bad2d5013cde7324615ac76109983be WatchSource:0}: Error finding container 761d64e9d1050c0142e5fdaecb6dfa016bad2d5013cde7324615ac76109983be: Status 404 returned error can't find the container with id 761d64e9d1050c0142e5fdaecb6dfa016bad2d5013cde7324615ac76109983be Dec 05 00:47:10 crc kubenswrapper[4759]: I1205 00:47:10.089760 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 00:47:10 crc kubenswrapper[4759]: I1205 00:47:10.248790 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54dff77467-gvp7s" event={"ID":"3fa7030d-e94e-4590-900a-8d5d41398af8","Type":"ContainerStarted","Data":"186415a7ed2030aad90059c2d1e7bd3ce30f31ebfad7a40ddde9f8d3881cb4d5"} Dec 05 00:47:10 crc kubenswrapper[4759]: I1205 00:47:10.249075 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54dff77467-gvp7s" event={"ID":"3fa7030d-e94e-4590-900a-8d5d41398af8","Type":"ContainerStarted","Data":"836354ab0de0e580c3714555344defc0b597a99581b63a1873d65d0db13e9179"} Dec 05 00:47:10 crc kubenswrapper[4759]: I1205 00:47:10.249437 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:10 crc kubenswrapper[4759]: I1205 00:47:10.252849 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8effa6a4-dc68-4020-bd47-c83bcdc8d337","Type":"ContainerStarted","Data":"761d64e9d1050c0142e5fdaecb6dfa016bad2d5013cde7324615ac76109983be"} Dec 05 00:47:10 crc kubenswrapper[4759]: I1205 00:47:10.261777 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5f7cf55b87-zh4pm"] Dec 05 00:47:10 crc kubenswrapper[4759]: I1205 00:47:10.398083 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-54dff77467-gvp7s" podStartSLOduration=1.3980644469999999 podStartE2EDuration="1.398064447s" 
podCreationTimestamp="2025-12-05 00:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:10.272481101 +0000 UTC m=+1449.488142051" watchObservedRunningTime="2025-12-05 00:47:10.398064447 +0000 UTC m=+1449.613725407" Dec 05 00:47:10 crc kubenswrapper[4759]: W1205 00:47:10.402441 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e3faf50_9e57_4e57_ba3a_f8b3b2a88b0b.slice/crio-9157554f119a9ae5048111ea08e515054854a9439f98d11cf915557f82307a50 WatchSource:0}: Error finding container 9157554f119a9ae5048111ea08e515054854a9439f98d11cf915557f82307a50: Status 404 returned error can't find the container with id 9157554f119a9ae5048111ea08e515054854a9439f98d11cf915557f82307a50 Dec 05 00:47:10 crc kubenswrapper[4759]: I1205 00:47:10.411650 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58c77cfcc9-n2qk9"] Dec 05 00:47:10 crc kubenswrapper[4759]: I1205 00:47:10.426283 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-pzvb2"] Dec 05 00:47:11 crc kubenswrapper[4759]: I1205 00:47:11.290518 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" event={"ID":"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9","Type":"ContainerStarted","Data":"04e17384fa398e5ee4b5b323c988482aa3bf2a0d6db2826bff02418b767ceb8a"} Dec 05 00:47:11 crc kubenswrapper[4759]: I1205 00:47:11.332543 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58c77cfcc9-n2qk9" event={"ID":"b91072a1-a45b-4c34-9093-485cd9431151","Type":"ContainerStarted","Data":"52880038e3a900e5d71b92191d33abf74f89d8521b2a7d4b28e8859d392942a4"} Dec 05 00:47:11 crc kubenswrapper[4759]: I1205 00:47:11.362781 4759 generic.go:334] "Generic (PLEG): container finished" podID="3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" containerID="b28b2c6a679dc5b38657a02d51c8835bf2f32d1902fe3a2efb704bbd5848a814" exitCode=0 Dec 05 00:47:11 crc kubenswrapper[4759]: I1205 00:47:11.364199 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" event={"ID":"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b","Type":"ContainerDied","Data":"b28b2c6a679dc5b38657a02d51c8835bf2f32d1902fe3a2efb704bbd5848a814"} Dec 05 00:47:11 crc kubenswrapper[4759]: I1205 00:47:11.364231 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" event={"ID":"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b","Type":"ContainerStarted","Data":"9157554f119a9ae5048111ea08e515054854a9439f98d11cf915557f82307a50"} Dec 05 00:47:12 crc kubenswrapper[4759]: I1205 00:47:12.032370 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 00:47:12 crc kubenswrapper[4759]: I1205 00:47:12.374474 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" event={"ID":"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b","Type":"ContainerStarted","Data":"66e366d87ac58c7498e948a4c3245518cd0f291399d2d92d946b5597530cb083"} Dec 05 00:47:12 crc kubenswrapper[4759]: I1205 00:47:12.374609 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:12 crc kubenswrapper[4759]: I1205 00:47:12.390998 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" 
podStartSLOduration=3.390799211 podStartE2EDuration="3.390799211s" podCreationTimestamp="2025-12-05 00:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:12.390189036 +0000 UTC m=+1451.605850006" watchObservedRunningTime="2025-12-05 00:47:12.390799211 +0000 UTC m=+1451.606460151" Dec 05 00:47:14 crc kubenswrapper[4759]: I1205 00:47:14.402246 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" event={"ID":"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9","Type":"ContainerStarted","Data":"5062975f5fbd1d8ecc67ec671e81a561b8420b5c874daf0d0ab5ee9e33eb7d56"} Dec 05 00:47:14 crc kubenswrapper[4759]: I1205 00:47:14.404224 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:14 crc kubenswrapper[4759]: I1205 00:47:14.406122 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58c77cfcc9-n2qk9" event={"ID":"b91072a1-a45b-4c34-9093-485cd9431151","Type":"ContainerStarted","Data":"b773efcbc7a001256dffef8e949baa9570a294485c97dbcd48cbb44d8bbea96e"} Dec 05 00:47:14 crc kubenswrapper[4759]: I1205 00:47:14.406384 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:14 crc kubenswrapper[4759]: I1205 00:47:14.427681 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" podStartSLOduration=2.303131709 podStartE2EDuration="5.427656662s" podCreationTimestamp="2025-12-05 00:47:09 +0000 UTC" firstStartedPulling="2025-12-05 00:47:10.24788683 +0000 UTC m=+1449.463547770" lastFinishedPulling="2025-12-05 00:47:13.372411753 +0000 UTC m=+1452.588072723" observedRunningTime="2025-12-05 00:47:14.419000701 +0000 UTC m=+1453.634661651" watchObservedRunningTime="2025-12-05 00:47:14.427656662 +0000 UTC m=+1453.643317612" Dec 05 00:47:14 crc kubenswrapper[4759]: I1205 00:47:14.443452 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-58c77cfcc9-n2qk9" podStartSLOduration=2.460699128 podStartE2EDuration="5.443433838s" podCreationTimestamp="2025-12-05 00:47:09 +0000 UTC" firstStartedPulling="2025-12-05 00:47:10.394740496 +0000 UTC m=+1449.610401446" lastFinishedPulling="2025-12-05 00:47:13.377475206 +0000 UTC m=+1452.593136156" observedRunningTime="2025-12-05 00:47:14.437732319 +0000 UTC m=+1453.653393269" watchObservedRunningTime="2025-12-05 00:47:14.443433838 +0000 UTC m=+1453.659094788" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.143182 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.143472 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="ceilometer-central-agent" containerID="cri-o://7c4ae836e5a36251b9a4ecfdc0b6b4741d86453eb944cbd1e5c74bbb8775d24e" gracePeriod=30 Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.143520 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="proxy-httpd" containerID="cri-o://6ef9020881c3b4cfd947de38180d29239144d6a23f1cf11d71ca87dc9decd683" gracePeriod=30 Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.143548 4759 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="sg-core" containerID="cri-o://77599f234268d4d8e87fbe877fa628451eeaeec3f0eeab413d118551fdd81ccb" gracePeriod=30 Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.143583 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="ceilometer-notification-agent" containerID="cri-o://72737862ad4a55821e89a60dda0106661dc8f94b543fca01302e7b62aee0db14" gracePeriod=30 Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.154092 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.239049 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-8b7bf4bd7-qq45k"] Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.242225 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-8b7bf4bd7-qq45k"] Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.242337 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.248378 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.248479 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.249201 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.374803 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f625a19c-a9af-401d-a834-37a79e3dfeb4-log-httpd\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.374859 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-combined-ca-bundle\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.374890 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-public-tls-certs\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.374917 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-internal-tls-certs\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.374954 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgdnq\" (UniqueName: \"kubernetes.io/projected/f625a19c-a9af-401d-a834-37a79e3dfeb4-kube-api-access-bgdnq\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.374986 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-config-data\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.375015 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f625a19c-a9af-401d-a834-37a79e3dfeb4-run-httpd\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.375069 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f625a19c-a9af-401d-a834-37a79e3dfeb4-etc-swift\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.429670 4759 generic.go:334] "Generic (PLEG): container finished" podID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerID="6ef9020881c3b4cfd947de38180d29239144d6a23f1cf11d71ca87dc9decd683" exitCode=0 Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.429757 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c","Type":"ContainerDied","Data":"6ef9020881c3b4cfd947de38180d29239144d6a23f1cf11d71ca87dc9decd683"} Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.429806 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c","Type":"ContainerDied","Data":"77599f234268d4d8e87fbe877fa628451eeaeec3f0eeab413d118551fdd81ccb"} Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.429771 4759 generic.go:334] "Generic (PLEG): container finished" podID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerID="77599f234268d4d8e87fbe877fa628451eeaeec3f0eeab413d118551fdd81ccb" exitCode=2 Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.476365 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgdnq\" (UniqueName: \"kubernetes.io/projected/f625a19c-a9af-401d-a834-37a79e3dfeb4-kube-api-access-bgdnq\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.476799 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-config-data\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.476831 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f625a19c-a9af-401d-a834-37a79e3dfeb4-run-httpd\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.476884 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f625a19c-a9af-401d-a834-37a79e3dfeb4-etc-swift\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.476927 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f625a19c-a9af-401d-a834-37a79e3dfeb4-log-httpd\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.476957 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-combined-ca-bundle\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.476982 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-public-tls-certs\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.477008 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-internal-tls-certs\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.478189 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f625a19c-a9af-401d-a834-37a79e3dfeb4-log-httpd\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.479776 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f625a19c-a9af-401d-a834-37a79e3dfeb4-run-httpd\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.489997 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-public-tls-certs\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.490458 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-config-data\") pod \"swift-proxy-8b7bf4bd7-qq45k\" 
(UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.490462 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-combined-ca-bundle\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.490746 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f625a19c-a9af-401d-a834-37a79e3dfeb4-etc-swift\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.491648 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f625a19c-a9af-401d-a834-37a79e3dfeb4-internal-tls-certs\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.494763 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgdnq\" (UniqueName: \"kubernetes.io/projected/f625a19c-a9af-401d-a834-37a79e3dfeb4-kube-api-access-bgdnq\") pod \"swift-proxy-8b7bf4bd7-qq45k\" (UID: \"f625a19c-a9af-401d-a834-37a79e3dfeb4\") " pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.594464 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-59cf789667-jg8vv"] Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.596745 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.618621 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.622487 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-67c656b6cc-qmtxr"] Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.639824 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.665637 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-59cf789667-jg8vv"] Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.690003 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-67c656b6cc-qmtxr"] Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.735583 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-859ddbff6d-twzqf"] Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.737833 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.763975 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-859ddbff6d-twzqf"] Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.793790 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-combined-ca-bundle\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.794625 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.794674 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.794716 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data-custom\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.794734 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data-custom\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.794751 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb595\" (UniqueName: \"kubernetes.io/projected/6c11dd68-b0df-437f-a857-870790e10769-kube-api-access-wb595\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.794849 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-combined-ca-bundle\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.794884 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4mtm\" (UniqueName: \"kubernetes.io/projected/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-kube-api-access-x4mtm\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 
00:47:15.896565 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-combined-ca-bundle\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.896614 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.896649 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.896671 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data-custom\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.896690 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data-custom\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.896710 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb595\" (UniqueName: \"kubernetes.io/projected/6c11dd68-b0df-437f-a857-870790e10769-kube-api-access-wb595\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.896770 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data-custom\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.896788 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-combined-ca-bundle\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.896809 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 
00:47:15.896827 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4mtm\" (UniqueName: \"kubernetes.io/projected/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-kube-api-access-x4mtm\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.896867 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n64b\" (UniqueName: \"kubernetes.io/projected/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-kube-api-access-4n64b\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.896888 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-combined-ca-bundle\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.904394 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data-custom\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.905369 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.908975 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-combined-ca-bundle\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.916903 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-combined-ca-bundle\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.917843 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data-custom\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.919141 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.919679 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x4mtm\" (UniqueName: \"kubernetes.io/projected/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-kube-api-access-x4mtm\") pod \"heat-engine-59cf789667-jg8vv\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:15 crc kubenswrapper[4759]: I1205 00:47:15.927039 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb595\" (UniqueName: \"kubernetes.io/projected/6c11dd68-b0df-437f-a857-870790e10769-kube-api-access-wb595\") pod \"heat-api-67c656b6cc-qmtxr\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:15.999613 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data-custom\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:15.999999 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.000053 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n64b\" (UniqueName: \"kubernetes.io/projected/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-kube-api-access-4n64b\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.000133 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-combined-ca-bundle\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.004128 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-combined-ca-bundle\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.004144 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data-custom\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.005092 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.019803 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.028326 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n64b\" (UniqueName: \"kubernetes.io/projected/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-kube-api-access-4n64b\") pod \"heat-cfnapi-859ddbff6d-twzqf\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.059018 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.080452 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.297816 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-8b7bf4bd7-qq45k"] Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.446100 4759 generic.go:334] "Generic (PLEG): container finished" podID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerID="7c4ae836e5a36251b9a4ecfdc0b6b4741d86453eb944cbd1e5c74bbb8775d24e" exitCode=0 Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.446179 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c","Type":"ContainerDied","Data":"7c4ae836e5a36251b9a4ecfdc0b6b4741d86453eb944cbd1e5c74bbb8775d24e"} Dec 05 00:47:16 crc kubenswrapper[4759]: I1205 00:47:16.478621 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-59cf789667-jg8vv"] Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.592527 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-58c77cfcc9-n2qk9"] Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.593075 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-58c77cfcc9-n2qk9" podUID="b91072a1-a45b-4c34-9093-485cd9431151" containerName="heat-api" containerID="cri-o://b773efcbc7a001256dffef8e949baa9570a294485c97dbcd48cbb44d8bbea96e" gracePeriod=60 Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.606382 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5f7cf55b87-zh4pm"] Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.606645 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" podUID="1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9" containerName="heat-cfnapi" containerID="cri-o://5062975f5fbd1d8ecc67ec671e81a561b8420b5c874daf0d0ab5ee9e33eb7d56" gracePeriod=60 Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.627532 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5f8fdfbc8b-gjmcr"] Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.629088 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.633284 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.637346 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.647957 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f8fdfbc8b-gjmcr"] Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.666347 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-dc865dc89-pdskl"] Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.667635 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.669292 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.669918 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.673537 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-dc865dc89-pdskl"] Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761263 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-public-tls-certs\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761322 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7rrm\" (UniqueName: \"kubernetes.io/projected/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-kube-api-access-x7rrm\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761354 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data-custom\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761405 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-internal-tls-certs\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761444 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-public-tls-certs\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761460 4759 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data-custom\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761503 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-internal-tls-certs\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761556 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-combined-ca-bundle\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761573 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-combined-ca-bundle\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761591 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761607 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwcch\" (UniqueName: \"kubernetes.io/projected/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-kube-api-access-fwcch\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.761623 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.863124 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7rrm\" (UniqueName: \"kubernetes.io/projected/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-kube-api-access-x7rrm\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.863190 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data-custom\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " 
pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.863265 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-internal-tls-certs\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.863343 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-public-tls-certs\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.863365 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data-custom\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.863429 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-internal-tls-certs\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.863510 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-combined-ca-bundle\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.863534 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-combined-ca-bundle\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.863562 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.863586 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwcch\" (UniqueName: \"kubernetes.io/projected/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-kube-api-access-fwcch\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.863606 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc 
kubenswrapper[4759]: I1205 00:47:17.863645 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-public-tls-certs\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.875944 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.883265 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data-custom\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.883416 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-combined-ca-bundle\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.883642 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-internal-tls-certs\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.883867 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data-custom\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.884203 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-combined-ca-bundle\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.885320 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-public-tls-certs\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.889501 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-internal-tls-certs\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.892820 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x7rrm\" (UniqueName: \"kubernetes.io/projected/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-kube-api-access-x7rrm\") pod \"heat-api-5f8fdfbc8b-gjmcr\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.893244 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.893574 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-public-tls-certs\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:17 crc kubenswrapper[4759]: I1205 00:47:17.905169 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwcch\" (UniqueName: \"kubernetes.io/projected/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-kube-api-access-fwcch\") pod \"heat-cfnapi-dc865dc89-pdskl\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:18 crc kubenswrapper[4759]: I1205 00:47:18.005840 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:18 crc kubenswrapper[4759]: I1205 00:47:18.012071 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:18 crc kubenswrapper[4759]: I1205 00:47:18.471492 4759 generic.go:334] "Generic (PLEG): container finished" podID="b91072a1-a45b-4c34-9093-485cd9431151" containerID="b773efcbc7a001256dffef8e949baa9570a294485c97dbcd48cbb44d8bbea96e" exitCode=0 Dec 05 00:47:18 crc kubenswrapper[4759]: I1205 00:47:18.471555 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58c77cfcc9-n2qk9" event={"ID":"b91072a1-a45b-4c34-9093-485cd9431151","Type":"ContainerDied","Data":"b773efcbc7a001256dffef8e949baa9570a294485c97dbcd48cbb44d8bbea96e"} Dec 05 00:47:18 crc kubenswrapper[4759]: I1205 00:47:18.473221 4759 generic.go:334] "Generic (PLEG): container finished" podID="1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9" containerID="5062975f5fbd1d8ecc67ec671e81a561b8420b5c874daf0d0ab5ee9e33eb7d56" exitCode=0 Dec 05 00:47:18 crc kubenswrapper[4759]: I1205 00:47:18.473243 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" event={"ID":"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9","Type":"ContainerDied","Data":"5062975f5fbd1d8ecc67ec671e81a561b8420b5c874daf0d0ab5ee9e33eb7d56"} Dec 05 00:47:19 crc kubenswrapper[4759]: I1205 00:47:19.485569 4759 generic.go:334] "Generic (PLEG): container finished" podID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerID="72737862ad4a55821e89a60dda0106661dc8f94b543fca01302e7b62aee0db14" exitCode=0 Dec 05 00:47:19 crc kubenswrapper[4759]: I1205 00:47:19.485850 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c","Type":"ContainerDied","Data":"72737862ad4a55821e89a60dda0106661dc8f94b543fca01302e7b62aee0db14"} Dec 05 00:47:19 crc kubenswrapper[4759]: I1205 00:47:19.631624 4759 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" podUID="1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.193:8000/healthcheck\": dial tcp 10.217.0.193:8000: connect: connection refused" Dec 05 00:47:19 crc kubenswrapper[4759]: I1205 00:47:19.693870 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-58c77cfcc9-n2qk9" podUID="b91072a1-a45b-4c34-9093-485cd9431151" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.195:8004/healthcheck\": dial tcp 10.217.0.195:8004: connect: connection refused" Dec 05 00:47:19 crc kubenswrapper[4759]: I1205 00:47:19.696463 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:47:19 crc kubenswrapper[4759]: I1205 00:47:19.780153 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pq529"] Dec 05 00:47:19 crc kubenswrapper[4759]: I1205 00:47:19.781758 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" podUID="50bae1fb-68c4-48a8-a768-d550bc43aa48" containerName="dnsmasq-dns" containerID="cri-o://028fec5be9d31a23684828f919f971187ab18b6976eb450551fce05317e6858a" gracePeriod=10 Dec 05 00:47:20 crc kubenswrapper[4759]: I1205 00:47:20.114093 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" podUID="50bae1fb-68c4-48a8-a768-d550bc43aa48" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.186:5353: connect: connection refused" Dec 05 00:47:20 crc kubenswrapper[4759]: I1205 00:47:20.503443 4759 generic.go:334] "Generic (PLEG): container finished" podID="50bae1fb-68c4-48a8-a768-d550bc43aa48" containerID="028fec5be9d31a23684828f919f971187ab18b6976eb450551fce05317e6858a" exitCode=0 Dec 05 00:47:20 crc kubenswrapper[4759]: I1205 00:47:20.503487 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" event={"ID":"50bae1fb-68c4-48a8-a768-d550bc43aa48","Type":"ContainerDied","Data":"028fec5be9d31a23684828f919f971187ab18b6976eb450551fce05317e6858a"} Dec 05 00:47:21 crc kubenswrapper[4759]: I1205 00:47:21.231535 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 00:47:21 crc kubenswrapper[4759]: I1205 00:47:21.231701 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="9fdfe467-6f94-4139-b67a-73d6b69e7753" containerName="kube-state-metrics" containerID="cri-o://90b06ff3bb1e6b7b8ead1354dc89e8b047dfe53dd2df7c4e64a075f5542271ad" gracePeriod=30 Dec 05 00:47:21 crc kubenswrapper[4759]: I1205 00:47:21.338943 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 05 00:47:21 crc kubenswrapper[4759]: I1205 00:47:21.339399 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="242736a8-8641-4915-b5cb-e271ce361e3a" containerName="mysqld-exporter" containerID="cri-o://0a1a4e5c3bc1eca49354588cfc7344eb89d87c0a7aade8c9f4ff3a461fd2d7d9" gracePeriod=30 Dec 05 00:47:21 crc kubenswrapper[4759]: I1205 00:47:21.377815 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.189:3000/\": dial tcp 
10.217.0.189:3000: connect: connection refused" Dec 05 00:47:21 crc kubenswrapper[4759]: I1205 00:47:21.513179 4759 generic.go:334] "Generic (PLEG): container finished" podID="9fdfe467-6f94-4139-b67a-73d6b69e7753" containerID="90b06ff3bb1e6b7b8ead1354dc89e8b047dfe53dd2df7c4e64a075f5542271ad" exitCode=2 Dec 05 00:47:21 crc kubenswrapper[4759]: I1205 00:47:21.513251 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9fdfe467-6f94-4139-b67a-73d6b69e7753","Type":"ContainerDied","Data":"90b06ff3bb1e6b7b8ead1354dc89e8b047dfe53dd2df7c4e64a075f5542271ad"} Dec 05 00:47:21 crc kubenswrapper[4759]: I1205 00:47:21.518487 4759 generic.go:334] "Generic (PLEG): container finished" podID="242736a8-8641-4915-b5cb-e271ce361e3a" containerID="0a1a4e5c3bc1eca49354588cfc7344eb89d87c0a7aade8c9f4ff3a461fd2d7d9" exitCode=2 Dec 05 00:47:21 crc kubenswrapper[4759]: I1205 00:47:21.518531 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"242736a8-8641-4915-b5cb-e271ce361e3a","Type":"ContainerDied","Data":"0a1a4e5c3bc1eca49354588cfc7344eb89d87c0a7aade8c9f4ff3a461fd2d7d9"} Dec 05 00:47:22 crc kubenswrapper[4759]: W1205 00:47:22.916561 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b2799ca_c468_46f0_8e2a_689e6d6bf81b.slice/crio-bcc525d0720f203e0370dc208caf2b56cc1aaf6fdf0a0087b0cc8b623b25561b WatchSource:0}: Error finding container bcc525d0720f203e0370dc208caf2b56cc1aaf6fdf0a0087b0cc8b623b25561b: Status 404 returned error can't find the container with id bcc525d0720f203e0370dc208caf2b56cc1aaf6fdf0a0087b0cc8b623b25561b Dec 05 00:47:23 crc kubenswrapper[4759]: I1205 00:47:23.598179 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8b7bf4bd7-qq45k" event={"ID":"f625a19c-a9af-401d-a834-37a79e3dfeb4","Type":"ContainerStarted","Data":"e5628f0498671b28ac8f923d27b7d51964bcce1c0b5fe5f4d93bd5c3b47e8d39"} Dec 05 00:47:23 crc kubenswrapper[4759]: I1205 00:47:23.601666 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-59cf789667-jg8vv" event={"ID":"8b2799ca-c468-46f0-8e2a-689e6d6bf81b","Type":"ContainerStarted","Data":"bcc525d0720f203e0370dc208caf2b56cc1aaf6fdf0a0087b0cc8b623b25561b"} Dec 05 00:47:23 crc kubenswrapper[4759]: I1205 00:47:23.722830 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:23 crc kubenswrapper[4759]: I1205 00:47:23.911159 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data-custom\") pod \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " Dec 05 00:47:23 crc kubenswrapper[4759]: I1205 00:47:23.911883 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data\") pod \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " Dec 05 00:47:23 crc kubenswrapper[4759]: I1205 00:47:23.912094 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8xsp\" (UniqueName: \"kubernetes.io/projected/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-kube-api-access-s8xsp\") pod \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " Dec 05 00:47:23 crc kubenswrapper[4759]: I1205 00:47:23.912213 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-combined-ca-bundle\") pod \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\" (UID: \"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9\") " Dec 05 00:47:23 crc kubenswrapper[4759]: I1205 00:47:23.941028 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-kube-api-access-s8xsp" (OuterVolumeSpecName: "kube-api-access-s8xsp") pod "1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9" (UID: "1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9"). InnerVolumeSpecName "kube-api-access-s8xsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:23 crc kubenswrapper[4759]: I1205 00:47:23.941085 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9" (UID: "1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.008655 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9" (UID: "1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.016278 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8xsp\" (UniqueName: \"kubernetes.io/projected/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-kube-api-access-s8xsp\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.016319 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.016329 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.081518 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data" (OuterVolumeSpecName: "config-data") pod "1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9" (UID: "1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.118171 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.309765 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.324625 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8mn7\" (UniqueName: \"kubernetes.io/projected/9fdfe467-6f94-4139-b67a-73d6b69e7753-kube-api-access-h8mn7\") pod \"9fdfe467-6f94-4139-b67a-73d6b69e7753\" (UID: \"9fdfe467-6f94-4139-b67a-73d6b69e7753\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.346482 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fdfe467-6f94-4139-b67a-73d6b69e7753-kube-api-access-h8mn7" (OuterVolumeSpecName: "kube-api-access-h8mn7") pod "9fdfe467-6f94-4139-b67a-73d6b69e7753" (UID: "9fdfe467-6f94-4139-b67a-73d6b69e7753"). InnerVolumeSpecName "kube-api-access-h8mn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.356762 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.358411 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.383711 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.422298 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.428342 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8mn7\" (UniqueName: \"kubernetes.io/projected/9fdfe467-6f94-4139-b67a-73d6b69e7753-kube-api-access-h8mn7\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.526383 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-67c656b6cc-qmtxr"] Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529321 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data\") pod \"b91072a1-a45b-4c34-9093-485cd9431151\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529364 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-swift-storage-0\") pod \"50bae1fb-68c4-48a8-a768-d550bc43aa48\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529390 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-sg-core-conf-yaml\") pod \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529419 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-run-httpd\") pod \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529469 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-config\") pod \"50bae1fb-68c4-48a8-a768-d550bc43aa48\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529509 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-combined-ca-bundle\") pod \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529539 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-svc\") pod \"50bae1fb-68c4-48a8-a768-d550bc43aa48\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529567 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data-custom\") pod \"b91072a1-a45b-4c34-9093-485cd9431151\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529624 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-config-data\") pod \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529659 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-combined-ca-bundle\") pod \"b91072a1-a45b-4c34-9093-485cd9431151\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529679 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr9ht\" (UniqueName: \"kubernetes.io/projected/b91072a1-a45b-4c34-9093-485cd9431151-kube-api-access-rr9ht\") pod \"b91072a1-a45b-4c34-9093-485cd9431151\" (UID: \"b91072a1-a45b-4c34-9093-485cd9431151\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529695 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-config-data\") pod \"242736a8-8641-4915-b5cb-e271ce361e3a\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529713 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-log-httpd\") pod \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529741 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw7wq\" (UniqueName: \"kubernetes.io/projected/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-kube-api-access-tw7wq\") pod \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529757 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-combined-ca-bundle\") pod \"242736a8-8641-4915-b5cb-e271ce361e3a\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529790 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blp96\" (UniqueName: \"kubernetes.io/projected/242736a8-8641-4915-b5cb-e271ce361e3a-kube-api-access-blp96\") pod \"242736a8-8641-4915-b5cb-e271ce361e3a\" (UID: \"242736a8-8641-4915-b5cb-e271ce361e3a\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529809 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-scripts\") pod \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\" (UID: \"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529836 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-sb\") pod \"50bae1fb-68c4-48a8-a768-d550bc43aa48\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529852 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t9dh\" 
(UniqueName: \"kubernetes.io/projected/50bae1fb-68c4-48a8-a768-d550bc43aa48-kube-api-access-7t9dh\") pod \"50bae1fb-68c4-48a8-a768-d550bc43aa48\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.529907 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-nb\") pod \"50bae1fb-68c4-48a8-a768-d550bc43aa48\" (UID: \"50bae1fb-68c4-48a8-a768-d550bc43aa48\") " Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.539090 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" (UID: "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.558153 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" (UID: "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.576556 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f8fdfbc8b-gjmcr"] Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.584241 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-scripts" (OuterVolumeSpecName: "scripts") pod "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" (UID: "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.588131 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b91072a1-a45b-4c34-9093-485cd9431151" (UID: "b91072a1-a45b-4c34-9093-485cd9431151"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.588493 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91072a1-a45b-4c34-9093-485cd9431151-kube-api-access-rr9ht" (OuterVolumeSpecName: "kube-api-access-rr9ht") pod "b91072a1-a45b-4c34-9093-485cd9431151" (UID: "b91072a1-a45b-4c34-9093-485cd9431151"). InnerVolumeSpecName "kube-api-access-rr9ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.592719 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-kube-api-access-tw7wq" (OuterVolumeSpecName: "kube-api-access-tw7wq") pod "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" (UID: "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c"). InnerVolumeSpecName "kube-api-access-tw7wq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.599532 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50bae1fb-68c4-48a8-a768-d550bc43aa48-kube-api-access-7t9dh" (OuterVolumeSpecName: "kube-api-access-7t9dh") pod "50bae1fb-68c4-48a8-a768-d550bc43aa48" (UID: "50bae1fb-68c4-48a8-a768-d550bc43aa48"). InnerVolumeSpecName "kube-api-access-7t9dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.604413 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/242736a8-8641-4915-b5cb-e271ce361e3a-kube-api-access-blp96" (OuterVolumeSpecName: "kube-api-access-blp96") pod "242736a8-8641-4915-b5cb-e271ce361e3a" (UID: "242736a8-8641-4915-b5cb-e271ce361e3a"). InnerVolumeSpecName "kube-api-access-blp96". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.607626 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-859ddbff6d-twzqf"] Dec 05 00:47:24 crc kubenswrapper[4759]: W1205 00:47:24.621581 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod466a3f6b_d457_4d8f_9aa1_33332ebbb5da.slice/crio-5156c8612a077d2cea82817c1842d32b3f5ce993672ffb0c1b40e9e2b579e0aa WatchSource:0}: Error finding container 5156c8612a077d2cea82817c1842d32b3f5ce993672ffb0c1b40e9e2b579e0aa: Status 404 returned error can't find the container with id 5156c8612a077d2cea82817c1842d32b3f5ce993672ffb0c1b40e9e2b579e0aa Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.627185 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8b7bf4bd7-qq45k" event={"ID":"f625a19c-a9af-401d-a834-37a79e3dfeb4","Type":"ContainerStarted","Data":"6610b48f74aa7891dd994c8741d24fd9d5072e9a581bd01fd398ce49ecd06a51"} Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.632210 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.632234 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr9ht\" (UniqueName: \"kubernetes.io/projected/b91072a1-a45b-4c34-9093-485cd9431151-kube-api-access-rr9ht\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.632245 4759 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.632260 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw7wq\" (UniqueName: \"kubernetes.io/projected/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-kube-api-access-tw7wq\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.632271 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blp96\" (UniqueName: \"kubernetes.io/projected/242736a8-8641-4915-b5cb-e271ce361e3a-kube-api-access-blp96\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.632279 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.632288 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t9dh\" (UniqueName: \"kubernetes.io/projected/50bae1fb-68c4-48a8-a768-d550bc43aa48-kube-api-access-7t9dh\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.632296 4759 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.635377 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" event={"ID":"1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9","Type":"ContainerDied","Data":"04e17384fa398e5ee4b5b323c988482aa3bf2a0d6db2826bff02418b767ceb8a"} Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.635428 4759 scope.go:117] "RemoveContainer" containerID="5062975f5fbd1d8ecc67ec671e81a561b8420b5c874daf0d0ab5ee9e33eb7d56" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.636361 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5f7cf55b87-zh4pm" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.647151 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58c77cfcc9-n2qk9" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.647999 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58c77cfcc9-n2qk9" event={"ID":"b91072a1-a45b-4c34-9093-485cd9431151","Type":"ContainerDied","Data":"52880038e3a900e5d71b92191d33abf74f89d8521b2a7d4b28e8859d392942a4"} Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.653895 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" event={"ID":"50bae1fb-68c4-48a8-a768-d550bc43aa48","Type":"ContainerDied","Data":"2dfb0cbb853645a030b0304edc7f9369ae7c59b94af31dbdd6f9ace85650a8f4"} Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.653987 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-pq529" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.656295 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9fdfe467-6f94-4139-b67a-73d6b69e7753","Type":"ContainerDied","Data":"e9dd9604bef12a4a9221d424b12fad29eb2aaad5a9e8521566271ba5c612204e"} Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.656454 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.663154 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae2c2b7-5a8b-44f0-a474-a95f3d47640c","Type":"ContainerDied","Data":"51b86621748a9c5173f2c6a1b2a88caf457c0ec920c8e6a7a2810d02eccc2ced"} Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.663227 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.671212 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-67c656b6cc-qmtxr" event={"ID":"6c11dd68-b0df-437f-a857-870790e10769","Type":"ContainerStarted","Data":"6107aa5fa064c05b15ae38e37b9dbdd1005e7acfddd0268b714a75ba61f3d48f"} Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.679588 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"242736a8-8641-4915-b5cb-e271ce361e3a","Type":"ContainerDied","Data":"e35350a0a6712eb8e65936aa23aa68a1eff7c54742d2f82eeaa88d50473194b9"} Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.679817 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.696861 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-59cf789667-jg8vv" event={"ID":"8b2799ca-c468-46f0-8e2a-689e6d6bf81b","Type":"ContainerStarted","Data":"f7d711493ef317366d30ce83014e5ca680ee182004e79d33a13bd627992b29da"} Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.699030 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.720888 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5f7cf55b87-zh4pm"] Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.737417 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5f7cf55b87-zh4pm"] Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.770078 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.779186 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.788668 4759 scope.go:117] "RemoveContainer" containerID="b773efcbc7a001256dffef8e949baa9570a294485c97dbcd48cbb44d8bbea96e" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.802191 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 00:47:24 crc kubenswrapper[4759]: E1205 00:47:24.802867 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bae1fb-68c4-48a8-a768-d550bc43aa48" containerName="dnsmasq-dns" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.802880 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bae1fb-68c4-48a8-a768-d550bc43aa48" containerName="dnsmasq-dns" Dec 05 00:47:24 crc kubenswrapper[4759]: E1205 00:47:24.802899 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91072a1-a45b-4c34-9093-485cd9431151" containerName="heat-api" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.802905 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91072a1-a45b-4c34-9093-485cd9431151" containerName="heat-api" Dec 05 00:47:24 crc kubenswrapper[4759]: E1205 00:47:24.802917 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242736a8-8641-4915-b5cb-e271ce361e3a" containerName="mysqld-exporter" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.802947 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="242736a8-8641-4915-b5cb-e271ce361e3a" containerName="mysqld-exporter" Dec 05 00:47:24 crc kubenswrapper[4759]: E1205 00:47:24.802957 4759 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="sg-core" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.802963 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="sg-core" Dec 05 00:47:24 crc kubenswrapper[4759]: E1205 00:47:24.802976 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bae1fb-68c4-48a8-a768-d550bc43aa48" containerName="init" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.802982 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bae1fb-68c4-48a8-a768-d550bc43aa48" containerName="init" Dec 05 00:47:24 crc kubenswrapper[4759]: E1205 00:47:24.802994 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="proxy-httpd" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.803000 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="proxy-httpd" Dec 05 00:47:24 crc kubenswrapper[4759]: E1205 00:47:24.803033 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="ceilometer-central-agent" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.803040 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="ceilometer-central-agent" Dec 05 00:47:24 crc kubenswrapper[4759]: E1205 00:47:24.803055 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9" containerName="heat-cfnapi" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.803060 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9" containerName="heat-cfnapi" Dec 05 00:47:24 crc kubenswrapper[4759]: E1205 00:47:24.803071 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fdfe467-6f94-4139-b67a-73d6b69e7753" containerName="kube-state-metrics" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.803076 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fdfe467-6f94-4139-b67a-73d6b69e7753" containerName="kube-state-metrics" Dec 05 00:47:24 crc kubenswrapper[4759]: E1205 00:47:24.803110 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="ceilometer-notification-agent" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.803759 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="ceilometer-notification-agent" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.804070 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="proxy-httpd" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.804081 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fdfe467-6f94-4139-b67a-73d6b69e7753" containerName="kube-state-metrics" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.804256 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9" containerName="heat-cfnapi" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.804335 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="b91072a1-a45b-4c34-9093-485cd9431151" containerName="heat-api" Dec 05 00:47:24 crc kubenswrapper[4759]: 
I1205 00:47:24.804349 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="242736a8-8641-4915-b5cb-e271ce361e3a" containerName="mysqld-exporter" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.804363 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="ceilometer-notification-agent" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.804371 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="50bae1fb-68c4-48a8-a768-d550bc43aa48" containerName="dnsmasq-dns" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.804377 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="sg-core" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.804415 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" containerName="ceilometer-central-agent" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.805382 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.807403 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.808277 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.822444 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.828049 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-59cf789667-jg8vv" podStartSLOduration=9.828032345 podStartE2EDuration="9.828032345s" podCreationTimestamp="2025-12-05 00:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:24.783572689 +0000 UTC m=+1463.999233639" watchObservedRunningTime="2025-12-05 00:47:24.828032345 +0000 UTC m=+1464.043693295" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.840882 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b91072a1-a45b-4c34-9093-485cd9431151" (UID: "b91072a1-a45b-4c34-9093-485cd9431151"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.842726 4759 scope.go:117] "RemoveContainer" containerID="028fec5be9d31a23684828f919f971187ab18b6976eb450551fce05317e6858a" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.842773 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/43572d36-66d8-45df-9976-33f0b1e313f9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.842844 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c86qd\" (UniqueName: \"kubernetes.io/projected/43572d36-66d8-45df-9976-33f0b1e313f9-kube-api-access-c86qd\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.842943 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43572d36-66d8-45df-9976-33f0b1e313f9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.843235 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/43572d36-66d8-45df-9976-33f0b1e313f9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.843441 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.852758 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50bae1fb-68c4-48a8-a768-d550bc43aa48" (UID: "50bae1fb-68c4-48a8-a768-d550bc43aa48"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.853236 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" (UID: "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.889464 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "242736a8-8641-4915-b5cb-e271ce361e3a" (UID: "242736a8-8641-4915-b5cb-e271ce361e3a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.897115 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-dc865dc89-pdskl"] Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.918541 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data" (OuterVolumeSpecName: "config-data") pod "b91072a1-a45b-4c34-9093-485cd9431151" (UID: "b91072a1-a45b-4c34-9093-485cd9431151"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.926802 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-config" (OuterVolumeSpecName: "config") pod "50bae1fb-68c4-48a8-a768-d550bc43aa48" (UID: "50bae1fb-68c4-48a8-a768-d550bc43aa48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.940183 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "50bae1fb-68c4-48a8-a768-d550bc43aa48" (UID: "50bae1fb-68c4-48a8-a768-d550bc43aa48"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.941184 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-config-data" (OuterVolumeSpecName: "config-data") pod "242736a8-8641-4915-b5cb-e271ce361e3a" (UID: "242736a8-8641-4915-b5cb-e271ce361e3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.951281 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/43572d36-66d8-45df-9976-33f0b1e313f9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.951508 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/43572d36-66d8-45df-9976-33f0b1e313f9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.951589 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c86qd\" (UniqueName: \"kubernetes.io/projected/43572d36-66d8-45df-9976-33f0b1e313f9-kube-api-access-c86qd\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.951828 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43572d36-66d8-45df-9976-33f0b1e313f9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.958097 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.960778 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91072a1-a45b-4c34-9093-485cd9431151-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.960814 4759 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.960834 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.960844 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.960879 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.960890 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242736a8-8641-4915-b5cb-e271ce361e3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.961333 4759 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50bae1fb-68c4-48a8-a768-d550bc43aa48" (UID: "50bae1fb-68c4-48a8-a768-d550bc43aa48"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.964616 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/43572d36-66d8-45df-9976-33f0b1e313f9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.982152 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c86qd\" (UniqueName: \"kubernetes.io/projected/43572d36-66d8-45df-9976-33f0b1e313f9-kube-api-access-c86qd\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.990329 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/43572d36-66d8-45df-9976-33f0b1e313f9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:24 crc kubenswrapper[4759]: I1205 00:47:24.994194 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43572d36-66d8-45df-9976-33f0b1e313f9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"43572d36-66d8-45df-9976-33f0b1e313f9\") " pod="openstack/kube-state-metrics-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:24.998910 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" (UID: "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.001461 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "50bae1fb-68c4-48a8-a768-d550bc43aa48" (UID: "50bae1fb-68c4-48a8-a768-d550bc43aa48"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.066725 4759 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.066755 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.066764 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50bae1fb-68c4-48a8-a768-d550bc43aa48-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.135720 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-config-data" (OuterVolumeSpecName: "config-data") pod "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" (UID: "2ae2c2b7-5a8b-44f0-a474-a95f3d47640c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.147441 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.168373 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.175675 4759 scope.go:117] "RemoveContainer" containerID="36221485da31ece5df6e42705967fb5874d4ba4c9c4a9a04d04312a3f08298c2" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.193269 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9" path="/var/lib/kubelet/pods/1e58ecc1-3de3-4b48-a51d-b3d6aa24c8c9/volumes" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.198783 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fdfe467-6f94-4139-b67a-73d6b69e7753" path="/var/lib/kubelet/pods/9fdfe467-6f94-4139-b67a-73d6b69e7753/volumes" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.206282 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-58c77cfcc9-n2qk9"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.219316 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-58c77cfcc9-n2qk9"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.230148 4759 scope.go:117] "RemoveContainer" containerID="90b06ff3bb1e6b7b8ead1354dc89e8b047dfe53dd2df7c4e64a075f5542271ad" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.444743 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.486939 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.500525 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.501911 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.506021 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.506182 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.515430 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.525936 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.536731 4759 scope.go:117] "RemoveContainer" containerID="6ef9020881c3b4cfd947de38180d29239144d6a23f1cf11d71ca87dc9decd683" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.537628 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.554497 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pq529"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.563424 4759 scope.go:117] "RemoveContainer" containerID="77599f234268d4d8e87fbe877fa628451eeaeec3f0eeab413d118551fdd81ccb" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.566972 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-pq529"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.582065 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.584995 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.587073 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.587730 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.587943 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.590210 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.590270 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-config-data\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.590334 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsggm\" (UniqueName: \"kubernetes.io/projected/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-kube-api-access-jsggm\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.590375 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.601568 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.616498 4759 scope.go:117] "RemoveContainer" containerID="72737862ad4a55821e89a60dda0106661dc8f94b543fca01302e7b62aee0db14" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.654384 4759 scope.go:117] "RemoveContainer" containerID="7c4ae836e5a36251b9a4ecfdc0b6b4741d86453eb944cbd1e5c74bbb8775d24e" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.681370 4759 scope.go:117] "RemoveContainer" containerID="0a1a4e5c3bc1eca49354588cfc7344eb89d87c0a7aade8c9f4ff3a461fd2d7d9" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754157 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754198 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " 
pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754240 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754256 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-config-data\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754279 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-config-data\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754336 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsggm\" (UniqueName: \"kubernetes.io/projected/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-kube-api-access-jsggm\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754365 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-log-httpd\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754389 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-run-httpd\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754416 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754438 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754483 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-scripts\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.754500 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-v6vzr\" (UniqueName: \"kubernetes.io/projected/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-kube-api-access-v6vzr\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.761064 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-config-data\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.763855 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.764555 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.783887 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsggm\" (UniqueName: \"kubernetes.io/projected/ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2-kube-api-access-jsggm\") pod \"mysqld-exporter-0\" (UID: \"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2\") " pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.793867 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" event={"ID":"466a3f6b-d457-4d8f-9aa1-33332ebbb5da","Type":"ContainerStarted","Data":"cd3a3dfd7fc59e402cad9adc2d3b1b4022e617b64dedf9e6dfe8c29e3cfdeb8f"} Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.793908 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" event={"ID":"466a3f6b-d457-4d8f-9aa1-33332ebbb5da","Type":"ContainerStarted","Data":"5156c8612a077d2cea82817c1842d32b3f5ce993672ffb0c1b40e9e2b579e0aa"} Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.794970 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.802887 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8effa6a4-dc68-4020-bd47-c83bcdc8d337","Type":"ContainerStarted","Data":"b683e54209f5bc04c6cd6536f4fc64dd2bf7e9b784a72ec231dd591ac7fc9b2e"} Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.804459 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" event={"ID":"dbb17f1f-6160-49fd-83c8-c4c04dbd1665","Type":"ContainerStarted","Data":"7efa5898d051e00f6d9f2acbb0b8982e3c05eb4a0a1d5d6a8ec7fd1967e34d6c"} Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.804496 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" event={"ID":"dbb17f1f-6160-49fd-83c8-c4c04dbd1665","Type":"ContainerStarted","Data":"31de40bb0a089b88af2faa579ad6d44c3cd9e365c5a860d17eced0e7a5becf95"} Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.804926 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.819157 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8b7bf4bd7-qq45k" event={"ID":"f625a19c-a9af-401d-a834-37a79e3dfeb4","Type":"ContainerStarted","Data":"f2d7516521fd676bd6fa330001886d48bf8ead3cdd7c822fa13ae80eff2d211a"} Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.819271 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.819358 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.826691 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.845585 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.849501 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-dc865dc89-pdskl" event={"ID":"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e","Type":"ContainerStarted","Data":"5edca824b8e8174871bb63690d372a2cbd59799f6282f923bdd89815c9b81820"} Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.849543 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-dc865dc89-pdskl" event={"ID":"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e","Type":"ContainerStarted","Data":"71f0dd746e11ea151ad232a364d7d88dd90f9d6f58f6d389f862ed81e81d6d2b"} Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.851107 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.856462 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" podStartSLOduration=8.851295413 podStartE2EDuration="8.851295413s" podCreationTimestamp="2025-12-05 00:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:25.813362517 +0000 UTC m=+1465.029023467" watchObservedRunningTime="2025-12-05 00:47:25.851295413 +0000 UTC m=+1465.066956363" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.858822 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.088256194 podStartE2EDuration="16.858812378s" podCreationTimestamp="2025-12-05 00:47:09 +0000 UTC" firstStartedPulling="2025-12-05 00:47:10.089594434 +0000 UTC m=+1449.305255374" lastFinishedPulling="2025-12-05 00:47:23.860150608 +0000 UTC m=+1463.075811558" observedRunningTime="2025-12-05 00:47:25.843373711 +0000 UTC m=+1465.059034661" watchObservedRunningTime="2025-12-05 00:47:25.858812378 +0000 UTC m=+1465.074473328" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.861456 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.862490 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.862561 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-config-data\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.862771 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-log-httpd\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.862841 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-run-httpd\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.862883 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.865158 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-scripts\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.865221 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6vzr\" (UniqueName: \"kubernetes.io/projected/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-kube-api-access-v6vzr\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.866065 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-log-httpd\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.868431 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-run-httpd\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.872950 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.875410 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-config-data\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.878441 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.879095 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.882825 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-scripts\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.890421 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" podStartSLOduration=10.890407588 podStartE2EDuration="10.890407588s" podCreationTimestamp="2025-12-05 00:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:25.868896064 +0000 UTC m=+1465.084557014" watchObservedRunningTime="2025-12-05 00:47:25.890407588 +0000 UTC m=+1465.106068538" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.896926 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-dc865dc89-pdskl" podStartSLOduration=8.896910168 podStartE2EDuration="8.896910168s" podCreationTimestamp="2025-12-05 00:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:25.894263343 +0000 UTC m=+1465.109924293" watchObservedRunningTime="2025-12-05 00:47:25.896910168 +0000 UTC m=+1465.112571118" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.898589 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6vzr\" (UniqueName: \"kubernetes.io/projected/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-kube-api-access-v6vzr\") pod \"ceilometer-0\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.907871 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-67c656b6cc-qmtxr" event={"ID":"6c11dd68-b0df-437f-a857-870790e10769","Type":"ContainerStarted","Data":"252abaf04b4dc824f4230df618450a5c920ba54a5267f34a0e9deb8af2a6e066"} Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.907912 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.909817 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.940399 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-8b7bf4bd7-qq45k" podStartSLOduration=10.940378479 podStartE2EDuration="10.940378479s" podCreationTimestamp="2025-12-05 00:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:25.937665583 +0000 UTC m=+1465.153326533" watchObservedRunningTime="2025-12-05 00:47:25.940378479 +0000 UTC m=+1465.156039429" Dec 05 00:47:25 crc kubenswrapper[4759]: I1205 00:47:25.976591 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-67c656b6cc-qmtxr" podStartSLOduration=10.976571643 podStartE2EDuration="10.976571643s" podCreationTimestamp="2025-12-05 00:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:25.965673377 +0000 UTC m=+1465.181334327" watchObservedRunningTime="2025-12-05 00:47:25.976571643 +0000 UTC m=+1465.192232593" Dec 05 00:47:26 crc kubenswrapper[4759]: E1205 00:47:26.607539 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbb17f1f_6160_49fd_83c8_c4c04dbd1665.slice/crio-conmon-7efa5898d051e00f6d9f2acbb0b8982e3c05eb4a0a1d5d6a8ec7fd1967e34d6c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c11dd68_b0df_437f_a857_870790e10769.slice/crio-252abaf04b4dc824f4230df618450a5c920ba54a5267f34a0e9deb8af2a6e066.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbb17f1f_6160_49fd_83c8_c4c04dbd1665.slice/crio-7efa5898d051e00f6d9f2acbb0b8982e3c05eb4a0a1d5d6a8ec7fd1967e34d6c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c11dd68_b0df_437f_a857_870790e10769.slice/crio-conmon-252abaf04b4dc824f4230df618450a5c920ba54a5267f34a0e9deb8af2a6e066.scope\": RecentStats: unable to find data in memory cache]" Dec 05 00:47:26 crc kubenswrapper[4759]: I1205 00:47:26.736554 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 05 00:47:26 crc kubenswrapper[4759]: I1205 00:47:26.892655 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:26 crc kubenswrapper[4759]: I1205 00:47:26.915990 4759 generic.go:334] "Generic (PLEG): container finished" podID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" containerID="7efa5898d051e00f6d9f2acbb0b8982e3c05eb4a0a1d5d6a8ec7fd1967e34d6c" exitCode=1 Dec 05 00:47:26 crc kubenswrapper[4759]: I1205 00:47:26.916663 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" event={"ID":"dbb17f1f-6160-49fd-83c8-c4c04dbd1665","Type":"ContainerDied","Data":"7efa5898d051e00f6d9f2acbb0b8982e3c05eb4a0a1d5d6a8ec7fd1967e34d6c"} Dec 05 00:47:26 crc kubenswrapper[4759]: I1205 00:47:26.916809 4759 scope.go:117] "RemoveContainer" containerID="7efa5898d051e00f6d9f2acbb0b8982e3c05eb4a0a1d5d6a8ec7fd1967e34d6c" Dec 05 00:47:26 crc kubenswrapper[4759]: I1205 00:47:26.921041 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" 
event={"ID":"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2","Type":"ContainerStarted","Data":"8770bdc2bbf05a70bf99d5238a88cb8f727e8fd9e1622a49c938dabef8019c3f"} Dec 05 00:47:26 crc kubenswrapper[4759]: I1205 00:47:26.942565 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"43572d36-66d8-45df-9976-33f0b1e313f9","Type":"ContainerStarted","Data":"e03e58d834dab4217d36a12913193c81e974b81ef4639567def84bf48c07cff5"} Dec 05 00:47:26 crc kubenswrapper[4759]: I1205 00:47:26.972140 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629","Type":"ContainerStarted","Data":"d18440e77b26321d4bcef535e0a2e206b990a413ba6a0431b98298ffc0e4d497"} Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.004844 4759 generic.go:334] "Generic (PLEG): container finished" podID="6c11dd68-b0df-437f-a857-870790e10769" containerID="252abaf04b4dc824f4230df618450a5c920ba54a5267f34a0e9deb8af2a6e066" exitCode=1 Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.008467 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-67c656b6cc-qmtxr" event={"ID":"6c11dd68-b0df-437f-a857-870790e10769","Type":"ContainerDied","Data":"252abaf04b4dc824f4230df618450a5c920ba54a5267f34a0e9deb8af2a6e066"} Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.009133 4759 scope.go:117] "RemoveContainer" containerID="252abaf04b4dc824f4230df618450a5c920ba54a5267f34a0e9deb8af2a6e066" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.048413 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hsbmj"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.049735 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hsbmj" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.068202 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hsbmj"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.315271 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="242736a8-8641-4915-b5cb-e271ce361e3a" path="/var/lib/kubelet/pods/242736a8-8641-4915-b5cb-e271ce361e3a/volumes" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.316444 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ae2c2b7-5a8b-44f0-a474-a95f3d47640c" path="/var/lib/kubelet/pods/2ae2c2b7-5a8b-44f0-a474-a95f3d47640c/volumes" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.317324 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50bae1fb-68c4-48a8-a768-d550bc43aa48" path="/var/lib/kubelet/pods/50bae1fb-68c4-48a8-a768-d550bc43aa48/volumes" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.318694 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b91072a1-a45b-4c34-9093-485cd9431151" path="/var/lib/kubelet/pods/b91072a1-a45b-4c34-9093-485cd9431151/volumes" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.319760 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-vzjc9"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.321339 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-vzjc9" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.323684 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vzjc9"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.403087 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crj9k\" (UniqueName: \"kubernetes.io/projected/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-kube-api-access-crj9k\") pod \"nova-api-db-create-hsbmj\" (UID: \"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6\") " pod="openstack/nova-api-db-create-hsbmj" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.403392 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-operator-scripts\") pod \"nova-api-db-create-hsbmj\" (UID: \"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6\") " pod="openstack/nova-api-db-create-hsbmj" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.447578 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f9f6-account-create-update-lq98w"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.450336 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f9f6-account-create-update-lq98w" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.452853 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f9f6-account-create-update-lq98w"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.455219 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.506592 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-operator-scripts\") pod \"nova-api-db-create-hsbmj\" (UID: \"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6\") " pod="openstack/nova-api-db-create-hsbmj" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.506647 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5e822d6-72f5-4222-8a41-4cd64c090c13-operator-scripts\") pod \"nova-cell0-db-create-vzjc9\" (UID: \"b5e822d6-72f5-4222-8a41-4cd64c090c13\") " pod="openstack/nova-cell0-db-create-vzjc9" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.506765 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crj9k\" (UniqueName: \"kubernetes.io/projected/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-kube-api-access-crj9k\") pod \"nova-api-db-create-hsbmj\" (UID: \"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6\") " pod="openstack/nova-api-db-create-hsbmj" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.506799 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jprs6\" (UniqueName: \"kubernetes.io/projected/b5e822d6-72f5-4222-8a41-4cd64c090c13-kube-api-access-jprs6\") pod \"nova-cell0-db-create-vzjc9\" (UID: \"b5e822d6-72f5-4222-8a41-4cd64c090c13\") " pod="openstack/nova-cell0-db-create-vzjc9" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.507600 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-operator-scripts\") pod \"nova-api-db-create-hsbmj\" (UID: \"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6\") " pod="openstack/nova-api-db-create-hsbmj" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.518132 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-h6trw"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.519539 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-h6trw" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.528200 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-bf22-account-create-update-24szb"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.530714 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bf22-account-create-update-24szb" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.533945 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.565337 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crj9k\" (UniqueName: \"kubernetes.io/projected/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-kube-api-access-crj9k\") pod \"nova-api-db-create-hsbmj\" (UID: \"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6\") " pod="openstack/nova-api-db-create-hsbmj" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.569592 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-h6trw"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.594557 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hsbmj" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.610361 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cxtb\" (UniqueName: \"kubernetes.io/projected/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-kube-api-access-5cxtb\") pod \"nova-api-f9f6-account-create-update-lq98w\" (UID: \"ebfe7fd9-c621-4fc3-a3ce-cba2143af712\") " pod="openstack/nova-api-f9f6-account-create-update-lq98w" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.610415 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5e822d6-72f5-4222-8a41-4cd64c090c13-operator-scripts\") pod \"nova-cell0-db-create-vzjc9\" (UID: \"b5e822d6-72f5-4222-8a41-4cd64c090c13\") " pod="openstack/nova-cell0-db-create-vzjc9" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.610739 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-operator-scripts\") pod \"nova-api-f9f6-account-create-update-lq98w\" (UID: \"ebfe7fd9-c621-4fc3-a3ce-cba2143af712\") " pod="openstack/nova-api-f9f6-account-create-update-lq98w" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.611075 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc526\" (UniqueName: \"kubernetes.io/projected/018e67eb-28b1-4a1e-8da2-115462fef72a-kube-api-access-hc526\") pod \"nova-cell1-db-create-h6trw\" (UID: \"018e67eb-28b1-4a1e-8da2-115462fef72a\") " pod="openstack/nova-cell1-db-create-h6trw" Dec 05 00:47:27 crc kubenswrapper[4759]: 
I1205 00:47:27.611222 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5e822d6-72f5-4222-8a41-4cd64c090c13-operator-scripts\") pod \"nova-cell0-db-create-vzjc9\" (UID: \"b5e822d6-72f5-4222-8a41-4cd64c090c13\") " pod="openstack/nova-cell0-db-create-vzjc9" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.611109 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jprs6\" (UniqueName: \"kubernetes.io/projected/b5e822d6-72f5-4222-8a41-4cd64c090c13-kube-api-access-jprs6\") pod \"nova-cell0-db-create-vzjc9\" (UID: \"b5e822d6-72f5-4222-8a41-4cd64c090c13\") " pod="openstack/nova-cell0-db-create-vzjc9" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.611391 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/018e67eb-28b1-4a1e-8da2-115462fef72a-operator-scripts\") pod \"nova-cell1-db-create-h6trw\" (UID: \"018e67eb-28b1-4a1e-8da2-115462fef72a\") " pod="openstack/nova-cell1-db-create-h6trw" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.626801 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bf22-account-create-update-24szb"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.626865 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jprs6\" (UniqueName: \"kubernetes.io/projected/b5e822d6-72f5-4222-8a41-4cd64c090c13-kube-api-access-jprs6\") pod \"nova-cell0-db-create-vzjc9\" (UID: \"b5e822d6-72f5-4222-8a41-4cd64c090c13\") " pod="openstack/nova-cell0-db-create-vzjc9" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.718396 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa8943e8-c599-484c-a821-c983033ed94a-operator-scripts\") pod \"nova-cell0-bf22-account-create-update-24szb\" (UID: \"fa8943e8-c599-484c-a821-c983033ed94a\") " pod="openstack/nova-cell0-bf22-account-create-update-24szb" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.718631 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-operator-scripts\") pod \"nova-api-f9f6-account-create-update-lq98w\" (UID: \"ebfe7fd9-c621-4fc3-a3ce-cba2143af712\") " pod="openstack/nova-api-f9f6-account-create-update-lq98w" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.718738 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc526\" (UniqueName: \"kubernetes.io/projected/018e67eb-28b1-4a1e-8da2-115462fef72a-kube-api-access-hc526\") pod \"nova-cell1-db-create-h6trw\" (UID: \"018e67eb-28b1-4a1e-8da2-115462fef72a\") " pod="openstack/nova-cell1-db-create-h6trw" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.718773 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/018e67eb-28b1-4a1e-8da2-115462fef72a-operator-scripts\") pod \"nova-cell1-db-create-h6trw\" (UID: \"018e67eb-28b1-4a1e-8da2-115462fef72a\") " pod="openstack/nova-cell1-db-create-h6trw" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.718789 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkvr4\" 
(UniqueName: \"kubernetes.io/projected/fa8943e8-c599-484c-a821-c983033ed94a-kube-api-access-pkvr4\") pod \"nova-cell0-bf22-account-create-update-24szb\" (UID: \"fa8943e8-c599-484c-a821-c983033ed94a\") " pod="openstack/nova-cell0-bf22-account-create-update-24szb" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.718851 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cxtb\" (UniqueName: \"kubernetes.io/projected/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-kube-api-access-5cxtb\") pod \"nova-api-f9f6-account-create-update-lq98w\" (UID: \"ebfe7fd9-c621-4fc3-a3ce-cba2143af712\") " pod="openstack/nova-api-f9f6-account-create-update-lq98w" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.719750 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-operator-scripts\") pod \"nova-api-f9f6-account-create-update-lq98w\" (UID: \"ebfe7fd9-c621-4fc3-a3ce-cba2143af712\") " pod="openstack/nova-api-f9f6-account-create-update-lq98w" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.720446 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/018e67eb-28b1-4a1e-8da2-115462fef72a-operator-scripts\") pod \"nova-cell1-db-create-h6trw\" (UID: \"018e67eb-28b1-4a1e-8da2-115462fef72a\") " pod="openstack/nova-cell1-db-create-h6trw" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.737322 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc526\" (UniqueName: \"kubernetes.io/projected/018e67eb-28b1-4a1e-8da2-115462fef72a-kube-api-access-hc526\") pod \"nova-cell1-db-create-h6trw\" (UID: \"018e67eb-28b1-4a1e-8da2-115462fef72a\") " pod="openstack/nova-cell1-db-create-h6trw" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.743583 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cxtb\" (UniqueName: \"kubernetes.io/projected/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-kube-api-access-5cxtb\") pod \"nova-api-f9f6-account-create-update-lq98w\" (UID: \"ebfe7fd9-c621-4fc3-a3ce-cba2143af712\") " pod="openstack/nova-api-f9f6-account-create-update-lq98w" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.755914 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vzjc9" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.791973 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f9f6-account-create-update-lq98w" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.801657 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-78a8-account-create-update-wwr6v"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.803088 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.806198 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.812782 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-78a8-account-create-update-wwr6v"] Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.820158 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa8943e8-c599-484c-a821-c983033ed94a-operator-scripts\") pod \"nova-cell0-bf22-account-create-update-24szb\" (UID: \"fa8943e8-c599-484c-a821-c983033ed94a\") " pod="openstack/nova-cell0-bf22-account-create-update-24szb" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.820289 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkvr4\" (UniqueName: \"kubernetes.io/projected/fa8943e8-c599-484c-a821-c983033ed94a-kube-api-access-pkvr4\") pod \"nova-cell0-bf22-account-create-update-24szb\" (UID: \"fa8943e8-c599-484c-a821-c983033ed94a\") " pod="openstack/nova-cell0-bf22-account-create-update-24szb" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.822858 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa8943e8-c599-484c-a821-c983033ed94a-operator-scripts\") pod \"nova-cell0-bf22-account-create-update-24szb\" (UID: \"fa8943e8-c599-484c-a821-c983033ed94a\") " pod="openstack/nova-cell0-bf22-account-create-update-24szb" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.860899 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkvr4\" (UniqueName: \"kubernetes.io/projected/fa8943e8-c599-484c-a821-c983033ed94a-kube-api-access-pkvr4\") pod \"nova-cell0-bf22-account-create-update-24szb\" (UID: \"fa8943e8-c599-484c-a821-c983033ed94a\") " pod="openstack/nova-cell0-bf22-account-create-update-24szb" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.925834 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86xfs\" (UniqueName: \"kubernetes.io/projected/710013b3-7fcc-4c39-a383-7361318ae0b6-kube-api-access-86xfs\") pod \"nova-cell1-78a8-account-create-update-wwr6v\" (UID: \"710013b3-7fcc-4c39-a383-7361318ae0b6\") " pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.926338 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/710013b3-7fcc-4c39-a383-7361318ae0b6-operator-scripts\") pod \"nova-cell1-78a8-account-create-update-wwr6v\" (UID: \"710013b3-7fcc-4c39-a383-7361318ae0b6\") " pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.962869 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-h6trw" Dec 05 00:47:27 crc kubenswrapper[4759]: I1205 00:47:27.980875 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-bf22-account-create-update-24szb" Dec 05 00:47:28 crc kubenswrapper[4759]: I1205 00:47:28.027689 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86xfs\" (UniqueName: \"kubernetes.io/projected/710013b3-7fcc-4c39-a383-7361318ae0b6-kube-api-access-86xfs\") pod \"nova-cell1-78a8-account-create-update-wwr6v\" (UID: \"710013b3-7fcc-4c39-a383-7361318ae0b6\") " pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" Dec 05 00:47:28 crc kubenswrapper[4759]: I1205 00:47:28.027756 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/710013b3-7fcc-4c39-a383-7361318ae0b6-operator-scripts\") pod \"nova-cell1-78a8-account-create-update-wwr6v\" (UID: \"710013b3-7fcc-4c39-a383-7361318ae0b6\") " pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" Dec 05 00:47:28 crc kubenswrapper[4759]: I1205 00:47:28.034571 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/710013b3-7fcc-4c39-a383-7361318ae0b6-operator-scripts\") pod \"nova-cell1-78a8-account-create-update-wwr6v\" (UID: \"710013b3-7fcc-4c39-a383-7361318ae0b6\") " pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" Dec 05 00:47:28 crc kubenswrapper[4759]: I1205 00:47:28.049749 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86xfs\" (UniqueName: \"kubernetes.io/projected/710013b3-7fcc-4c39-a383-7361318ae0b6-kube-api-access-86xfs\") pod \"nova-cell1-78a8-account-create-update-wwr6v\" (UID: \"710013b3-7fcc-4c39-a383-7361318ae0b6\") " pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" Dec 05 00:47:28 crc kubenswrapper[4759]: I1205 00:47:28.112662 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hsbmj"] Dec 05 00:47:28 crc kubenswrapper[4759]: I1205 00:47:28.131031 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" Dec 05 00:47:28 crc kubenswrapper[4759]: I1205 00:47:28.353628 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f9f6-account-create-update-lq98w"] Dec 05 00:47:28 crc kubenswrapper[4759]: I1205 00:47:28.370506 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vzjc9"] Dec 05 00:47:28 crc kubenswrapper[4759]: W1205 00:47:28.378297 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebfe7fd9_c621_4fc3_a3ce_cba2143af712.slice/crio-1f96fa65cf7cb2bfb87527726e66f24399f49bb09dba0c575fa177c67aa2542b WatchSource:0}: Error finding container 1f96fa65cf7cb2bfb87527726e66f24399f49bb09dba0c575fa177c67aa2542b: Status 404 returned error can't find the container with id 1f96fa65cf7cb2bfb87527726e66f24399f49bb09dba0c575fa177c67aa2542b Dec 05 00:47:28 crc kubenswrapper[4759]: I1205 00:47:28.687664 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bf22-account-create-update-24szb"] Dec 05 00:47:28 crc kubenswrapper[4759]: I1205 00:47:28.857128 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-h6trw"] Dec 05 00:47:28 crc kubenswrapper[4759]: W1205 00:47:28.866984 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod018e67eb_28b1_4a1e_8da2_115462fef72a.slice/crio-dbebdf42ab501af36a15a623d22a79820602aae3adebb95c881538d2504b3cf0 WatchSource:0}: Error finding container dbebdf42ab501af36a15a623d22a79820602aae3adebb95c881538d2504b3cf0: Status 404 returned error can't find the container with id dbebdf42ab501af36a15a623d22a79820602aae3adebb95c881538d2504b3cf0 Dec 05 00:47:28 crc kubenswrapper[4759]: W1205 00:47:28.874051 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod710013b3_7fcc_4c39_a383_7361318ae0b6.slice/crio-889032ac49a8cf069917fa79e0105d31b9d2ab6bcd8d1d58b160c2fb7b2dc2e6 WatchSource:0}: Error finding container 889032ac49a8cf069917fa79e0105d31b9d2ab6bcd8d1d58b160c2fb7b2dc2e6: Status 404 returned error can't find the container with id 889032ac49a8cf069917fa79e0105d31b9d2ab6bcd8d1d58b160c2fb7b2dc2e6 Dec 05 00:47:28 crc kubenswrapper[4759]: I1205 00:47:28.909969 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-78a8-account-create-update-wwr6v"] Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.033258 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bf22-account-create-update-24szb" event={"ID":"fa8943e8-c599-484c-a821-c983033ed94a","Type":"ContainerStarted","Data":"889daac973015c5549792f0933772b85f851f1b9cb29bbb7ddaa6e3154737634"} Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.038272 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vzjc9" event={"ID":"b5e822d6-72f5-4222-8a41-4cd64c090c13","Type":"ContainerStarted","Data":"139b46ad2caaeb13207f6bdd7360947abb7115d2dceb8a7d5e27c4f3ec6396b3"} Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.038324 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vzjc9" event={"ID":"b5e822d6-72f5-4222-8a41-4cd64c090c13","Type":"ContainerStarted","Data":"e7bbce49c65a187a2cd2f7de5f2c375130064373f3ff38148917603c774496d7"} Dec 05 00:47:29 crc 
kubenswrapper[4759]: I1205 00:47:29.047996 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" event={"ID":"710013b3-7fcc-4c39-a383-7361318ae0b6","Type":"ContainerStarted","Data":"889032ac49a8cf069917fa79e0105d31b9d2ab6bcd8d1d58b160c2fb7b2dc2e6"} Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.055403 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-h6trw" event={"ID":"018e67eb-28b1-4a1e-8da2-115462fef72a","Type":"ContainerStarted","Data":"dbebdf42ab501af36a15a623d22a79820602aae3adebb95c881538d2504b3cf0"} Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.059099 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hsbmj" event={"ID":"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6","Type":"ContainerStarted","Data":"6e7299083296aa9b09586cbf7338a73dcbc4c02fedc01d10dc3c2eea7c01c6c7"} Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.059143 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hsbmj" event={"ID":"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6","Type":"ContainerStarted","Data":"862d8d3a9a48a87f2b963bbdf8ecd4cbd100abfd89ca8d9225bb1f963e2b0929"} Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.066107 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f9f6-account-create-update-lq98w" event={"ID":"ebfe7fd9-c621-4fc3-a3ce-cba2143af712","Type":"ContainerStarted","Data":"11e05ea9873d298f6a1fea5b94ea1bfd679a98c041c8e3961c45bca03fd4b558"} Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.066235 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f9f6-account-create-update-lq98w" event={"ID":"ebfe7fd9-c621-4fc3-a3ce-cba2143af712","Type":"ContainerStarted","Data":"1f96fa65cf7cb2bfb87527726e66f24399f49bb09dba0c575fa177c67aa2542b"} Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.084935 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-vzjc9" podStartSLOduration=2.084918051 podStartE2EDuration="2.084918051s" podCreationTimestamp="2025-12-05 00:47:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:29.081835226 +0000 UTC m=+1468.297496176" watchObservedRunningTime="2025-12-05 00:47:29.084918051 +0000 UTC m=+1468.300579001" Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.098577 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-hsbmj" podStartSLOduration=2.098559044 podStartE2EDuration="2.098559044s" podCreationTimestamp="2025-12-05 00:47:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:29.097328914 +0000 UTC m=+1468.312989864" watchObservedRunningTime="2025-12-05 00:47:29.098559044 +0000 UTC m=+1468.314219994" Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.160706 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-f9f6-account-create-update-lq98w" podStartSLOduration=2.160683251 podStartE2EDuration="2.160683251s" podCreationTimestamp="2025-12-05 00:47:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:47:29.148612216 +0000 UTC m=+1468.364273166" 
watchObservedRunningTime="2025-12-05 00:47:29.160683251 +0000 UTC m=+1468.376344201" Dec 05 00:47:29 crc kubenswrapper[4759]: I1205 00:47:29.522189 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.090129 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"43572d36-66d8-45df-9976-33f0b1e313f9","Type":"ContainerStarted","Data":"97bb098556797c5937350d1ebe47ebdc6997177159dfb502bdf65dc2781f01f5"} Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.090475 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.092055 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629","Type":"ContainerStarted","Data":"9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8"} Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.094785 4759 generic.go:334] "Generic (PLEG): container finished" podID="710013b3-7fcc-4c39-a383-7361318ae0b6" containerID="777e420008aabd5db54b0bad93712191fce704248b5ef00c54aabeb0e58cce00" exitCode=0 Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.094876 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" event={"ID":"710013b3-7fcc-4c39-a383-7361318ae0b6","Type":"ContainerDied","Data":"777e420008aabd5db54b0bad93712191fce704248b5ef00c54aabeb0e58cce00"} Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.096113 4759 generic.go:334] "Generic (PLEG): container finished" podID="018e67eb-28b1-4a1e-8da2-115462fef72a" containerID="503ade21aa166f3707a0a1af5b6c51f42c541655ea19ebc11caeac5b2b38e2dc" exitCode=0 Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.096230 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-h6trw" event={"ID":"018e67eb-28b1-4a1e-8da2-115462fef72a","Type":"ContainerDied","Data":"503ade21aa166f3707a0a1af5b6c51f42c541655ea19ebc11caeac5b2b38e2dc"} Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.098530 4759 generic.go:334] "Generic (PLEG): container finished" podID="6c11dd68-b0df-437f-a857-870790e10769" containerID="4fa30eb77d4842983a53fd0ebfd1affd4a3be6d64b7096531352a7a6ec58bed8" exitCode=1 Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.098635 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-67c656b6cc-qmtxr" event={"ID":"6c11dd68-b0df-437f-a857-870790e10769","Type":"ContainerDied","Data":"4fa30eb77d4842983a53fd0ebfd1affd4a3be6d64b7096531352a7a6ec58bed8"} Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.098699 4759 scope.go:117] "RemoveContainer" containerID="252abaf04b4dc824f4230df618450a5c920ba54a5267f34a0e9deb8af2a6e066" Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.100803 4759 scope.go:117] "RemoveContainer" containerID="4fa30eb77d4842983a53fd0ebfd1affd4a3be6d64b7096531352a7a6ec58bed8" Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.102710 4759 generic.go:334] "Generic (PLEG): container finished" podID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" containerID="6c87b322114ef2228e55573278aa7b48d5e29514390c44f635991df47cae5c1b" exitCode=1 Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.102756 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" 
event={"ID":"dbb17f1f-6160-49fd-83c8-c4c04dbd1665","Type":"ContainerDied","Data":"6c87b322114ef2228e55573278aa7b48d5e29514390c44f635991df47cae5c1b"} Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.103335 4759 scope.go:117] "RemoveContainer" containerID="6c87b322114ef2228e55573278aa7b48d5e29514390c44f635991df47cae5c1b" Dec 05 00:47:30 crc kubenswrapper[4759]: E1205 00:47:30.103587 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-859ddbff6d-twzqf_openstack(dbb17f1f-6160-49fd-83c8-c4c04dbd1665)\"" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" podUID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.109782 4759 generic.go:334] "Generic (PLEG): container finished" podID="ebfe7fd9-c621-4fc3-a3ce-cba2143af712" containerID="11e05ea9873d298f6a1fea5b94ea1bfd679a98c041c8e3961c45bca03fd4b558" exitCode=0 Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.109870 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f9f6-account-create-update-lq98w" event={"ID":"ebfe7fd9-c621-4fc3-a3ce-cba2143af712","Type":"ContainerDied","Data":"11e05ea9873d298f6a1fea5b94ea1bfd679a98c041c8e3961c45bca03fd4b558"} Dec 05 00:47:30 crc kubenswrapper[4759]: E1205 00:47:30.111481 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-67c656b6cc-qmtxr_openstack(6c11dd68-b0df-437f-a857-870790e10769)\"" pod="openstack/heat-api-67c656b6cc-qmtxr" podUID="6c11dd68-b0df-437f-a857-870790e10769" Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.114106 4759 generic.go:334] "Generic (PLEG): container finished" podID="fa8943e8-c599-484c-a821-c983033ed94a" containerID="d30a68c6629e79dc3a0bef6f425488cc91e9880857d62e70e196b69284dd14cd" exitCode=0 Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.114165 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bf22-account-create-update-24szb" event={"ID":"fa8943e8-c599-484c-a821-c983033ed94a","Type":"ContainerDied","Data":"d30a68c6629e79dc3a0bef6f425488cc91e9880857d62e70e196b69284dd14cd"} Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.115942 4759 generic.go:334] "Generic (PLEG): container finished" podID="b5e822d6-72f5-4222-8a41-4cd64c090c13" containerID="139b46ad2caaeb13207f6bdd7360947abb7115d2dceb8a7d5e27c4f3ec6396b3" exitCode=0 Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.115983 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vzjc9" event={"ID":"b5e822d6-72f5-4222-8a41-4cd64c090c13","Type":"ContainerDied","Data":"139b46ad2caaeb13207f6bdd7360947abb7115d2dceb8a7d5e27c4f3ec6396b3"} Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.118283 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.7390017110000002 podStartE2EDuration="6.118265645s" podCreationTimestamp="2025-12-05 00:47:24 +0000 UTC" firstStartedPulling="2025-12-05 00:47:25.83725749 +0000 UTC m=+1465.052918440" lastFinishedPulling="2025-12-05 00:47:29.216521434 +0000 UTC m=+1468.432182374" observedRunningTime="2025-12-05 00:47:30.110119076 +0000 UTC m=+1469.325780026" watchObservedRunningTime="2025-12-05 00:47:30.118265645 +0000 UTC m=+1469.333926595" Dec 05 00:47:30 crc 
kubenswrapper[4759]: I1205 00:47:30.118648 4759 generic.go:334] "Generic (PLEG): container finished" podID="12b3ce4d-2b4b-4872-91a5-c353c80d2cb6" containerID="6e7299083296aa9b09586cbf7338a73dcbc4c02fedc01d10dc3c2eea7c01c6c7" exitCode=0 Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.118688 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hsbmj" event={"ID":"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6","Type":"ContainerDied","Data":"6e7299083296aa9b09586cbf7338a73dcbc4c02fedc01d10dc3c2eea7c01c6c7"} Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.129150 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2","Type":"ContainerStarted","Data":"a41bc8da95b4f4fe388d1f9f18ea66127c7bf16ee3f842953910cf60eb482716"} Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.170604 4759 scope.go:117] "RemoveContainer" containerID="7efa5898d051e00f6d9f2acbb0b8982e3c05eb4a0a1d5d6a8ec7fd1967e34d6c" Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.256270 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.634724245 podStartE2EDuration="5.256254875s" podCreationTimestamp="2025-12-05 00:47:25 +0000 UTC" firstStartedPulling="2025-12-05 00:47:26.727912021 +0000 UTC m=+1465.943572971" lastFinishedPulling="2025-12-05 00:47:29.349442651 +0000 UTC m=+1468.565103601" observedRunningTime="2025-12-05 00:47:30.251284194 +0000 UTC m=+1469.466945144" watchObservedRunningTime="2025-12-05 00:47:30.256254875 +0000 UTC m=+1469.471915825" Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.643807 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:30 crc kubenswrapper[4759]: I1205 00:47:30.645207 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-8b7bf4bd7-qq45k" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.059852 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.060144 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.080663 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.080898 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.144813 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629","Type":"ContainerStarted","Data":"1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e"} Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.179248 4759 scope.go:117] "RemoveContainer" containerID="6c87b322114ef2228e55573278aa7b48d5e29514390c44f635991df47cae5c1b" Dec 05 00:47:31 crc kubenswrapper[4759]: E1205 00:47:31.179713 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-859ddbff6d-twzqf_openstack(dbb17f1f-6160-49fd-83c8-c4c04dbd1665)\"" 
pod="openstack/heat-cfnapi-859ddbff6d-twzqf" podUID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.180865 4759 scope.go:117] "RemoveContainer" containerID="4fa30eb77d4842983a53fd0ebfd1affd4a3be6d64b7096531352a7a6ec58bed8" Dec 05 00:47:31 crc kubenswrapper[4759]: E1205 00:47:31.181130 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-67c656b6cc-qmtxr_openstack(6c11dd68-b0df-437f-a857-870790e10769)\"" pod="openstack/heat-api-67c656b6cc-qmtxr" podUID="6c11dd68-b0df-437f-a857-870790e10769" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.280165 4759 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod2b80deca-451b-446c-88b9-42c4521b4cc8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod2b80deca-451b-446c-88b9-42c4521b4cc8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2b80deca_451b_446c_88b9_42c4521b4cc8.slice" Dec 05 00:47:31 crc kubenswrapper[4759]: E1205 00:47:31.280206 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod2b80deca-451b-446c-88b9-42c4521b4cc8] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod2b80deca-451b-446c-88b9-42c4521b4cc8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2b80deca_451b_446c_88b9_42c4521b4cc8.slice" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.578675 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f9f6-account-create-update-lq98w" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.761716 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cxtb\" (UniqueName: \"kubernetes.io/projected/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-kube-api-access-5cxtb\") pod \"ebfe7fd9-c621-4fc3-a3ce-cba2143af712\" (UID: \"ebfe7fd9-c621-4fc3-a3ce-cba2143af712\") " Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.761790 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-operator-scripts\") pod \"ebfe7fd9-c621-4fc3-a3ce-cba2143af712\" (UID: \"ebfe7fd9-c621-4fc3-a3ce-cba2143af712\") " Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.762226 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ebfe7fd9-c621-4fc3-a3ce-cba2143af712" (UID: "ebfe7fd9-c621-4fc3-a3ce-cba2143af712"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.762334 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.766781 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-kube-api-access-5cxtb" (OuterVolumeSpecName: "kube-api-access-5cxtb") pod "ebfe7fd9-c621-4fc3-a3ce-cba2143af712" (UID: "ebfe7fd9-c621-4fc3-a3ce-cba2143af712"). InnerVolumeSpecName "kube-api-access-5cxtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:31 crc kubenswrapper[4759]: I1205 00:47:31.864018 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cxtb\" (UniqueName: \"kubernetes.io/projected/ebfe7fd9-c621-4fc3-a3ce-cba2143af712-kube-api-access-5cxtb\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.187384 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hsbmj" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.191878 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-h6trw" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.193138 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.199152 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bf22-account-create-update-24szb" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.202816 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hsbmj" event={"ID":"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6","Type":"ContainerDied","Data":"862d8d3a9a48a87f2b963bbdf8ecd4cbd100abfd89ca8d9225bb1f963e2b0929"} Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.202858 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="862d8d3a9a48a87f2b963bbdf8ecd4cbd100abfd89ca8d9225bb1f963e2b0929" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.202838 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hsbmj" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.207037 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f9f6-account-create-update-lq98w" event={"ID":"ebfe7fd9-c621-4fc3-a3ce-cba2143af712","Type":"ContainerDied","Data":"1f96fa65cf7cb2bfb87527726e66f24399f49bb09dba0c575fa177c67aa2542b"} Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.207075 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f96fa65cf7cb2bfb87527726e66f24399f49bb09dba0c575fa177c67aa2542b" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.207151 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f9f6-account-create-update-lq98w" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.217208 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-bf22-account-create-update-24szb" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.217615 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bf22-account-create-update-24szb" event={"ID":"fa8943e8-c599-484c-a821-c983033ed94a","Type":"ContainerDied","Data":"889daac973015c5549792f0933772b85f851f1b9cb29bbb7ddaa6e3154737634"} Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.217669 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="889daac973015c5549792f0933772b85f851f1b9cb29bbb7ddaa6e3154737634" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.224945 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vzjc9" event={"ID":"b5e822d6-72f5-4222-8a41-4cd64c090c13","Type":"ContainerDied","Data":"e7bbce49c65a187a2cd2f7de5f2c375130064373f3ff38148917603c774496d7"} Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.224976 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7bbce49c65a187a2cd2f7de5f2c375130064373f3ff38148917603c774496d7" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.226559 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vzjc9" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.229676 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629","Type":"ContainerStarted","Data":"e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df"} Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.231368 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" event={"ID":"710013b3-7fcc-4c39-a383-7361318ae0b6","Type":"ContainerDied","Data":"889032ac49a8cf069917fa79e0105d31b9d2ab6bcd8d1d58b160c2fb7b2dc2e6"} Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.231396 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="889032ac49a8cf069917fa79e0105d31b9d2ab6bcd8d1d58b160c2fb7b2dc2e6" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.231483 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-78a8-account-create-update-wwr6v" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.236281 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-h6trw" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.236360 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b7dd4f9bb-wh6p9" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.238819 4759 scope.go:117] "RemoveContainer" containerID="6c87b322114ef2228e55573278aa7b48d5e29514390c44f635991df47cae5c1b" Dec 05 00:47:32 crc kubenswrapper[4759]: E1205 00:47:32.239254 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-859ddbff6d-twzqf_openstack(dbb17f1f-6160-49fd-83c8-c4c04dbd1665)\"" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" podUID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.239334 4759 scope.go:117] "RemoveContainer" containerID="4fa30eb77d4842983a53fd0ebfd1affd4a3be6d64b7096531352a7a6ec58bed8" Dec 05 00:47:32 crc kubenswrapper[4759]: E1205 00:47:32.240255 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-67c656b6cc-qmtxr_openstack(6c11dd68-b0df-437f-a857-870790e10769)\"" pod="openstack/heat-api-67c656b6cc-qmtxr" podUID="6c11dd68-b0df-437f-a857-870790e10769" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.240817 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-h6trw" event={"ID":"018e67eb-28b1-4a1e-8da2-115462fef72a","Type":"ContainerDied","Data":"dbebdf42ab501af36a15a623d22a79820602aae3adebb95c881538d2504b3cf0"} Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.240896 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbebdf42ab501af36a15a623d22a79820602aae3adebb95c881538d2504b3cf0" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.320831 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b7dd4f9bb-wh6p9"] Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.329151 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5b7dd4f9bb-wh6p9"] Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.374226 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkvr4\" (UniqueName: \"kubernetes.io/projected/fa8943e8-c599-484c-a821-c983033ed94a-kube-api-access-pkvr4\") pod \"fa8943e8-c599-484c-a821-c983033ed94a\" (UID: \"fa8943e8-c599-484c-a821-c983033ed94a\") " Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.374352 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/018e67eb-28b1-4a1e-8da2-115462fef72a-operator-scripts\") pod \"018e67eb-28b1-4a1e-8da2-115462fef72a\" (UID: \"018e67eb-28b1-4a1e-8da2-115462fef72a\") " Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.374397 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jprs6\" (UniqueName: \"kubernetes.io/projected/b5e822d6-72f5-4222-8a41-4cd64c090c13-kube-api-access-jprs6\") pod \"b5e822d6-72f5-4222-8a41-4cd64c090c13\" (UID: \"b5e822d6-72f5-4222-8a41-4cd64c090c13\") " Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.374427 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86xfs\" (UniqueName: \"kubernetes.io/projected/710013b3-7fcc-4c39-a383-7361318ae0b6-kube-api-access-86xfs\") pod \"710013b3-7fcc-4c39-a383-7361318ae0b6\" (UID: 
\"710013b3-7fcc-4c39-a383-7361318ae0b6\") " Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.374452 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5e822d6-72f5-4222-8a41-4cd64c090c13-operator-scripts\") pod \"b5e822d6-72f5-4222-8a41-4cd64c090c13\" (UID: \"b5e822d6-72f5-4222-8a41-4cd64c090c13\") " Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.374490 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-operator-scripts\") pod \"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6\" (UID: \"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6\") " Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.374538 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa8943e8-c599-484c-a821-c983033ed94a-operator-scripts\") pod \"fa8943e8-c599-484c-a821-c983033ed94a\" (UID: \"fa8943e8-c599-484c-a821-c983033ed94a\") " Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.374560 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crj9k\" (UniqueName: \"kubernetes.io/projected/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-kube-api-access-crj9k\") pod \"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6\" (UID: \"12b3ce4d-2b4b-4872-91a5-c353c80d2cb6\") " Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.374616 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc526\" (UniqueName: \"kubernetes.io/projected/018e67eb-28b1-4a1e-8da2-115462fef72a-kube-api-access-hc526\") pod \"018e67eb-28b1-4a1e-8da2-115462fef72a\" (UID: \"018e67eb-28b1-4a1e-8da2-115462fef72a\") " Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.374688 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/710013b3-7fcc-4c39-a383-7361318ae0b6-operator-scripts\") pod \"710013b3-7fcc-4c39-a383-7361318ae0b6\" (UID: \"710013b3-7fcc-4c39-a383-7361318ae0b6\") " Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.375168 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018e67eb-28b1-4a1e-8da2-115462fef72a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "018e67eb-28b1-4a1e-8da2-115462fef72a" (UID: "018e67eb-28b1-4a1e-8da2-115462fef72a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.375236 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa8943e8-c599-484c-a821-c983033ed94a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa8943e8-c599-484c-a821-c983033ed94a" (UID: "fa8943e8-c599-484c-a821-c983033ed94a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.375328 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12b3ce4d-2b4b-4872-91a5-c353c80d2cb6" (UID: "12b3ce4d-2b4b-4872-91a5-c353c80d2cb6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.376289 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/710013b3-7fcc-4c39-a383-7361318ae0b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "710013b3-7fcc-4c39-a383-7361318ae0b6" (UID: "710013b3-7fcc-4c39-a383-7361318ae0b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.376517 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e822d6-72f5-4222-8a41-4cd64c090c13-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5e822d6-72f5-4222-8a41-4cd64c090c13" (UID: "b5e822d6-72f5-4222-8a41-4cd64c090c13"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.380586 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8943e8-c599-484c-a821-c983033ed94a-kube-api-access-pkvr4" (OuterVolumeSpecName: "kube-api-access-pkvr4") pod "fa8943e8-c599-484c-a821-c983033ed94a" (UID: "fa8943e8-c599-484c-a821-c983033ed94a"). InnerVolumeSpecName "kube-api-access-pkvr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.380640 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-kube-api-access-crj9k" (OuterVolumeSpecName: "kube-api-access-crj9k") pod "12b3ce4d-2b4b-4872-91a5-c353c80d2cb6" (UID: "12b3ce4d-2b4b-4872-91a5-c353c80d2cb6"). InnerVolumeSpecName "kube-api-access-crj9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.380663 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/018e67eb-28b1-4a1e-8da2-115462fef72a-kube-api-access-hc526" (OuterVolumeSpecName: "kube-api-access-hc526") pod "018e67eb-28b1-4a1e-8da2-115462fef72a" (UID: "018e67eb-28b1-4a1e-8da2-115462fef72a"). InnerVolumeSpecName "kube-api-access-hc526". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.380678 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e822d6-72f5-4222-8a41-4cd64c090c13-kube-api-access-jprs6" (OuterVolumeSpecName: "kube-api-access-jprs6") pod "b5e822d6-72f5-4222-8a41-4cd64c090c13" (UID: "b5e822d6-72f5-4222-8a41-4cd64c090c13"). InnerVolumeSpecName "kube-api-access-jprs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.380869 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/710013b3-7fcc-4c39-a383-7361318ae0b6-kube-api-access-86xfs" (OuterVolumeSpecName: "kube-api-access-86xfs") pod "710013b3-7fcc-4c39-a383-7361318ae0b6" (UID: "710013b3-7fcc-4c39-a383-7361318ae0b6"). InnerVolumeSpecName "kube-api-access-86xfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.477402 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc526\" (UniqueName: \"kubernetes.io/projected/018e67eb-28b1-4a1e-8da2-115462fef72a-kube-api-access-hc526\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.477445 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/710013b3-7fcc-4c39-a383-7361318ae0b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.477461 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkvr4\" (UniqueName: \"kubernetes.io/projected/fa8943e8-c599-484c-a821-c983033ed94a-kube-api-access-pkvr4\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.477473 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/018e67eb-28b1-4a1e-8da2-115462fef72a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.477485 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jprs6\" (UniqueName: \"kubernetes.io/projected/b5e822d6-72f5-4222-8a41-4cd64c090c13-kube-api-access-jprs6\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.477498 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86xfs\" (UniqueName: \"kubernetes.io/projected/710013b3-7fcc-4c39-a383-7361318ae0b6-kube-api-access-86xfs\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.477508 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5e822d6-72f5-4222-8a41-4cd64c090c13-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.477521 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.477533 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa8943e8-c599-484c-a821-c983033ed94a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:32 crc kubenswrapper[4759]: I1205 00:47:32.477543 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crj9k\" (UniqueName: \"kubernetes.io/projected/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6-kube-api-access-crj9k\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:33 crc kubenswrapper[4759]: I1205 00:47:33.166624 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b80deca-451b-446c-88b9-42c4521b4cc8" path="/var/lib/kubelet/pods/2b80deca-451b-446c-88b9-42c4521b4cc8/volumes" Dec 05 00:47:33 crc kubenswrapper[4759]: I1205 00:47:33.280485 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-vzjc9" Dec 05 00:47:33 crc kubenswrapper[4759]: I1205 00:47:33.280563 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629","Type":"ContainerStarted","Data":"84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc"} Dec 05 00:47:33 crc kubenswrapper[4759]: I1205 00:47:33.328443 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.783328535 podStartE2EDuration="8.32842712s" podCreationTimestamp="2025-12-05 00:47:25 +0000 UTC" firstStartedPulling="2025-12-05 00:47:26.881566443 +0000 UTC m=+1466.097227393" lastFinishedPulling="2025-12-05 00:47:32.426665028 +0000 UTC m=+1471.642325978" observedRunningTime="2025-12-05 00:47:33.311331661 +0000 UTC m=+1472.526992611" watchObservedRunningTime="2025-12-05 00:47:33.32842712 +0000 UTC m=+1472.544088070" Dec 05 00:47:34 crc kubenswrapper[4759]: I1205 00:47:34.288028 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 00:47:34 crc kubenswrapper[4759]: I1205 00:47:34.433757 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:47:34 crc kubenswrapper[4759]: I1205 00:47:34.433817 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:47:34 crc kubenswrapper[4759]: I1205 00:47:34.613824 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:47:34 crc kubenswrapper[4759]: I1205 00:47:34.626667 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:47:34 crc kubenswrapper[4759]: I1205 00:47:34.742770 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-67c656b6cc-qmtxr"] Dec 05 00:47:34 crc kubenswrapper[4759]: I1205 00:47:34.797380 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-859ddbff6d-twzqf"] Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.172622 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.304888 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-67c656b6cc-qmtxr" event={"ID":"6c11dd68-b0df-437f-a857-870790e10769","Type":"ContainerDied","Data":"6107aa5fa064c05b15ae38e37b9dbdd1005e7acfddd0268b714a75ba61f3d48f"} Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.304941 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6107aa5fa064c05b15ae38e37b9dbdd1005e7acfddd0268b714a75ba61f3d48f" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.325687 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" 
event={"ID":"dbb17f1f-6160-49fd-83c8-c4c04dbd1665","Type":"ContainerDied","Data":"31de40bb0a089b88af2faa579ad6d44c3cd9e365c5a860d17eced0e7a5becf95"} Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.325929 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31de40bb0a089b88af2faa579ad6d44c3cd9e365c5a860d17eced0e7a5becf95" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.338846 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.343182 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.435117 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data-custom\") pod \"6c11dd68-b0df-437f-a857-870790e10769\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.435219 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data\") pod \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.435288 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data\") pod \"6c11dd68-b0df-437f-a857-870790e10769\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.435396 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data-custom\") pod \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.435440 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb595\" (UniqueName: \"kubernetes.io/projected/6c11dd68-b0df-437f-a857-870790e10769-kube-api-access-wb595\") pod \"6c11dd68-b0df-437f-a857-870790e10769\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.435488 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n64b\" (UniqueName: \"kubernetes.io/projected/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-kube-api-access-4n64b\") pod \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.435511 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-combined-ca-bundle\") pod \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\" (UID: \"dbb17f1f-6160-49fd-83c8-c4c04dbd1665\") " Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.435562 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-combined-ca-bundle\") pod 
\"6c11dd68-b0df-437f-a857-870790e10769\" (UID: \"6c11dd68-b0df-437f-a857-870790e10769\") " Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.443999 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6c11dd68-b0df-437f-a857-870790e10769" (UID: "6c11dd68-b0df-437f-a857-870790e10769"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.444715 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c11dd68-b0df-437f-a857-870790e10769-kube-api-access-wb595" (OuterVolumeSpecName: "kube-api-access-wb595") pod "6c11dd68-b0df-437f-a857-870790e10769" (UID: "6c11dd68-b0df-437f-a857-870790e10769"). InnerVolumeSpecName "kube-api-access-wb595". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.445405 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dbb17f1f-6160-49fd-83c8-c4c04dbd1665" (UID: "dbb17f1f-6160-49fd-83c8-c4c04dbd1665"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.446974 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-kube-api-access-4n64b" (OuterVolumeSpecName: "kube-api-access-4n64b") pod "dbb17f1f-6160-49fd-83c8-c4c04dbd1665" (UID: "dbb17f1f-6160-49fd-83c8-c4c04dbd1665"). InnerVolumeSpecName "kube-api-access-4n64b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.479903 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbb17f1f-6160-49fd-83c8-c4c04dbd1665" (UID: "dbb17f1f-6160-49fd-83c8-c4c04dbd1665"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.489802 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c11dd68-b0df-437f-a857-870790e10769" (UID: "6c11dd68-b0df-437f-a857-870790e10769"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.503155 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data" (OuterVolumeSpecName: "config-data") pod "dbb17f1f-6160-49fd-83c8-c4c04dbd1665" (UID: "dbb17f1f-6160-49fd-83c8-c4c04dbd1665"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.523000 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data" (OuterVolumeSpecName: "config-data") pod "6c11dd68-b0df-437f-a857-870790e10769" (UID: "6c11dd68-b0df-437f-a857-870790e10769"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.543097 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.543138 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.543150 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.543166 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb595\" (UniqueName: \"kubernetes.io/projected/6c11dd68-b0df-437f-a857-870790e10769-kube-api-access-wb595\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.543178 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.543189 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n64b\" (UniqueName: \"kubernetes.io/projected/dbb17f1f-6160-49fd-83c8-c4c04dbd1665-kube-api-access-4n64b\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.543200 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:35 crc kubenswrapper[4759]: I1205 00:47:35.543212 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c11dd68-b0df-437f-a857-870790e10769-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.057867 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.109169 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-54dff77467-gvp7s"] Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.109410 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-54dff77467-gvp7s" podUID="3fa7030d-e94e-4590-900a-8d5d41398af8" containerName="heat-engine" containerID="cri-o://186415a7ed2030aad90059c2d1e7bd3ce30f31ebfad7a40ddde9f8d3881cb4d5" gracePeriod=60 Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.186088 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 
00:47:36.333921 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-67c656b6cc-qmtxr" Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.333952 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-859ddbff6d-twzqf" Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.334501 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="ceilometer-central-agent" containerID="cri-o://9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8" gracePeriod=30 Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.334550 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="proxy-httpd" containerID="cri-o://84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc" gracePeriod=30 Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.334565 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="sg-core" containerID="cri-o://e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df" gracePeriod=30 Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.334598 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="ceilometer-notification-agent" containerID="cri-o://1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e" gracePeriod=30 Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.383141 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-859ddbff6d-twzqf"] Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.392418 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-859ddbff6d-twzqf"] Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.408865 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-67c656b6cc-qmtxr"] Dec 05 00:47:36 crc kubenswrapper[4759]: I1205 00:47:36.418693 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-67c656b6cc-qmtxr"] Dec 05 00:47:36 crc kubenswrapper[4759]: E1205 00:47:36.874813 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f7c1e1c_cd9d_4cd8_965c_8dd37ad8e629.slice/crio-conmon-1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e.scope\": RecentStats: unable to find data in memory cache]" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.165406 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c11dd68-b0df-437f-a857-870790e10769" path="/var/lib/kubelet/pods/6c11dd68-b0df-437f-a857-870790e10769/volumes" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.165960 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" path="/var/lib/kubelet/pods/dbb17f1f-6160-49fd-83c8-c4c04dbd1665/volumes" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.345329 4759 generic.go:334] "Generic (PLEG): container finished" podID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerID="84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc" 
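The "Killing container with a grace period" records above map to a CRI StopContainer call with a timeout: in outline, the runtime delivers SIGTERM, waits up to gracePeriod (60s for heat-engine, 30s for the ceilometer containers), then SIGKILLs; the exit codes logged just after this (0 for clean shutdowns, 2 for sg-core) are whatever each process returned on SIGTERM. A stand-alone sketch of that pattern; stopWithGrace is a hypothetical helper, not CRI-O code:

package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// stopWithGrace sends SIGTERM, waits up to grace, then SIGKILLs,
// mirroring the StopContainer contract in outline.
func stopWithGrace(cmd *exec.Cmd, grace time.Duration) error {
	if err := cmd.Process.Signal(syscall.SIGTERM); err != nil {
		return err
	}
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case err := <-done:
		return err // exited within the grace period
	case <-time.After(grace):
		_ = cmd.Process.Kill() // grace expired: SIGKILL
		return <-done
	}
}

func main() {
	cmd := exec.Command("sleep", "300")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	fmt.Println(stopWithGrace(cmd, 30*time.Second)) // "signal: terminated"
}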
exitCode=0 Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.345358 4759 generic.go:334] "Generic (PLEG): container finished" podID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerID="e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df" exitCode=2 Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.345367 4759 generic.go:334] "Generic (PLEG): container finished" podID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerID="1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e" exitCode=0 Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.345385 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629","Type":"ContainerDied","Data":"84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc"} Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.345410 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629","Type":"ContainerDied","Data":"e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df"} Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.345421 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629","Type":"ContainerDied","Data":"1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e"} Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.911077 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l4fmq"] Dec 05 00:47:37 crc kubenswrapper[4759]: E1205 00:47:37.911717 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b3ce4d-2b4b-4872-91a5-c353c80d2cb6" containerName="mariadb-database-create" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.911732 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b3ce4d-2b4b-4872-91a5-c353c80d2cb6" containerName="mariadb-database-create" Dec 05 00:47:37 crc kubenswrapper[4759]: E1205 00:47:37.911746 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" containerName="heat-cfnapi" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.911752 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" containerName="heat-cfnapi" Dec 05 00:47:37 crc kubenswrapper[4759]: E1205 00:47:37.911763 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c11dd68-b0df-437f-a857-870790e10769" containerName="heat-api" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.911770 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c11dd68-b0df-437f-a857-870790e10769" containerName="heat-api" Dec 05 00:47:37 crc kubenswrapper[4759]: E1205 00:47:37.911788 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710013b3-7fcc-4c39-a383-7361318ae0b6" containerName="mariadb-account-create-update" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.911794 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="710013b3-7fcc-4c39-a383-7361318ae0b6" containerName="mariadb-account-create-update" Dec 05 00:47:37 crc kubenswrapper[4759]: E1205 00:47:37.911813 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" containerName="heat-cfnapi" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.911819 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" 
containerName="heat-cfnapi" Dec 05 00:47:37 crc kubenswrapper[4759]: E1205 00:47:37.911829 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebfe7fd9-c621-4fc3-a3ce-cba2143af712" containerName="mariadb-account-create-update" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.911835 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfe7fd9-c621-4fc3-a3ce-cba2143af712" containerName="mariadb-account-create-update" Dec 05 00:47:37 crc kubenswrapper[4759]: E1205 00:47:37.911845 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018e67eb-28b1-4a1e-8da2-115462fef72a" containerName="mariadb-database-create" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.911850 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="018e67eb-28b1-4a1e-8da2-115462fef72a" containerName="mariadb-database-create" Dec 05 00:47:37 crc kubenswrapper[4759]: E1205 00:47:37.911863 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e822d6-72f5-4222-8a41-4cd64c090c13" containerName="mariadb-database-create" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.911869 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e822d6-72f5-4222-8a41-4cd64c090c13" containerName="mariadb-database-create" Dec 05 00:47:37 crc kubenswrapper[4759]: E1205 00:47:37.911881 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8943e8-c599-484c-a821-c983033ed94a" containerName="mariadb-account-create-update" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.911886 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8943e8-c599-484c-a821-c983033ed94a" containerName="mariadb-account-create-update" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.912049 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="018e67eb-28b1-4a1e-8da2-115462fef72a" containerName="mariadb-database-create" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.912065 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b3ce4d-2b4b-4872-91a5-c353c80d2cb6" containerName="mariadb-database-create" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.912074 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c11dd68-b0df-437f-a857-870790e10769" containerName="heat-api" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.912087 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8943e8-c599-484c-a821-c983033ed94a" containerName="mariadb-account-create-update" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.912099 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" containerName="heat-cfnapi" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.912108 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb17f1f-6160-49fd-83c8-c4c04dbd1665" containerName="heat-cfnapi" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.912115 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebfe7fd9-c621-4fc3-a3ce-cba2143af712" containerName="mariadb-account-create-update" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.912125 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="710013b3-7fcc-4c39-a383-7361318ae0b6" containerName="mariadb-account-create-update" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.912136 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e822d6-72f5-4222-8a41-4cd64c090c13" 
containerName="mariadb-database-create" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.912785 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.915340 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.915384 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.915478 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rqh9c" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.939475 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l4fmq"] Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.988255 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.988431 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n7qq\" (UniqueName: \"kubernetes.io/projected/ee78c1b8-dbc7-496b-b757-4433f2764e67-kube-api-access-4n7qq\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.988466 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-config-data\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:37 crc kubenswrapper[4759]: I1205 00:47:37.988499 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-scripts\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:38 crc kubenswrapper[4759]: I1205 00:47:38.091079 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:38 crc kubenswrapper[4759]: I1205 00:47:38.091229 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n7qq\" (UniqueName: \"kubernetes.io/projected/ee78c1b8-dbc7-496b-b757-4433f2764e67-kube-api-access-4n7qq\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:38 crc kubenswrapper[4759]: I1205 00:47:38.091265 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-config-data\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:38 crc kubenswrapper[4759]: I1205 00:47:38.091658 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-scripts\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:38 crc kubenswrapper[4759]: I1205 00:47:38.099565 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-config-data\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:38 crc kubenswrapper[4759]: I1205 00:47:38.113580 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-scripts\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:38 crc kubenswrapper[4759]: I1205 00:47:38.114432 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n7qq\" (UniqueName: \"kubernetes.io/projected/ee78c1b8-dbc7-496b-b757-4433f2764e67-kube-api-access-4n7qq\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:38 crc kubenswrapper[4759]: I1205 00:47:38.117878 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l4fmq\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:38 crc kubenswrapper[4759]: I1205 00:47:38.232292 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:47:38 crc kubenswrapper[4759]: I1205 00:47:38.807009 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l4fmq"] Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.076987 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.215187 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-combined-ca-bundle\") pod \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.215243 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-sg-core-conf-yaml\") pod \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.215295 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-scripts\") pod \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.215365 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-ceilometer-tls-certs\") pod \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.215385 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6vzr\" (UniqueName: \"kubernetes.io/projected/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-kube-api-access-v6vzr\") pod \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.215436 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-run-httpd\") pod \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.215501 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-config-data\") pod \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.215527 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-log-httpd\") pod \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\" (UID: \"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629\") " Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.216157 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" (UID: "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.216525 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" (UID: "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.223596 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-kube-api-access-v6vzr" (OuterVolumeSpecName: "kube-api-access-v6vzr") pod "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" (UID: "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629"). InnerVolumeSpecName "kube-api-access-v6vzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.231264 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-scripts" (OuterVolumeSpecName: "scripts") pod "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" (UID: "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.268022 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" (UID: "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.317763 4759 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.317791 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.317801 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6vzr\" (UniqueName: \"kubernetes.io/projected/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-kube-api-access-v6vzr\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.317809 4759 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.317818 4759 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.336484 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" (UID: "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.338418 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" (UID: "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.373590 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-config-data" (OuterVolumeSpecName: "config-data") pod "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" (UID: "1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.388258 4759 generic.go:334] "Generic (PLEG): container finished" podID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerID="9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8" exitCode=0 Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.388344 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629","Type":"ContainerDied","Data":"9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8"} Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.388369 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.388396 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629","Type":"ContainerDied","Data":"d18440e77b26321d4bcef535e0a2e206b990a413ba6a0431b98298ffc0e4d497"} Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.388415 4759 scope.go:117] "RemoveContainer" containerID="84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.392742 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l4fmq" event={"ID":"ee78c1b8-dbc7-496b-b757-4433f2764e67","Type":"ContainerStarted","Data":"0fa954f2bd0e61334a5e062469f3788fa1497e9e3b087c99da22c6208e8f7f2b"} Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.420154 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.420193 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.420208 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.449197 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="186415a7ed2030aad90059c2d1e7bd3ce30f31ebfad7a40ddde9f8d3881cb4d5" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.451438 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="186415a7ed2030aad90059c2d1e7bd3ce30f31ebfad7a40ddde9f8d3881cb4d5" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.462755 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="186415a7ed2030aad90059c2d1e7bd3ce30f31ebfad7a40ddde9f8d3881cb4d5" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.462825 4759 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-54dff77467-gvp7s" podUID="3fa7030d-e94e-4590-900a-8d5d41398af8" containerName="heat-engine" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.464592 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.472242 4759 scope.go:117] "RemoveContainer" containerID="e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.486275 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.507043 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.507740 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c11dd68-b0df-437f-a857-870790e10769" containerName="heat-api" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.507820 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c11dd68-b0df-437f-a857-870790e10769" containerName="heat-api" Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.507894 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="ceilometer-central-agent" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.507945 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="ceilometer-central-agent" Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.508019 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="sg-core" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.508073 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="sg-core" Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.508133 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="ceilometer-notification-agent" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.508188 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="ceilometer-notification-agent" Dec 05 00:47:39 crc kubenswrapper[4759]: 
E1205 00:47:39.508257 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="proxy-httpd" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.508319 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="proxy-httpd" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.508570 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="proxy-httpd" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.508648 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c11dd68-b0df-437f-a857-870790e10769" containerName="heat-api" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.508708 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="ceilometer-notification-agent" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.508770 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="ceilometer-central-agent" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.508829 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" containerName="sg-core" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.511822 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.513653 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.515016 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.517250 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.517548 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.540485 4759 scope.go:117] "RemoveContainer" containerID="1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.569071 4759 scope.go:117] "RemoveContainer" containerID="9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.591879 4759 scope.go:117] "RemoveContainer" containerID="84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc" Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.592475 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc\": container with ID starting with 84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc not found: ID does not exist" containerID="84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.592505 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc"} err="failed to get container status \"84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc\": rpc error: code = NotFound 
desc = could not find container \"84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc\": container with ID starting with 84f8d9e08674ae51bcccba5dd1b7bab17f4235d5f474d41c7a32595610fd4bcc not found: ID does not exist" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.592526 4759 scope.go:117] "RemoveContainer" containerID="e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df" Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.592769 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df\": container with ID starting with e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df not found: ID does not exist" containerID="e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.592792 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df"} err="failed to get container status \"e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df\": rpc error: code = NotFound desc = could not find container \"e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df\": container with ID starting with e19dbd99be87b3510ad9c6920657eb411bbb944fae9ff88832dd857ed2b4e0df not found: ID does not exist" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.592807 4759 scope.go:117] "RemoveContainer" containerID="1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e" Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.592981 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e\": container with ID starting with 1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e not found: ID does not exist" containerID="1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.592999 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e"} err="failed to get container status \"1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e\": rpc error: code = NotFound desc = could not find container \"1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e\": container with ID starting with 1eb90cd3d9efd93dca052a645386c617867c5c6070c6879eee82d0d545b3315e not found: ID does not exist" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.593009 4759 scope.go:117] "RemoveContainer" containerID="9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8" Dec 05 00:47:39 crc kubenswrapper[4759]: E1205 00:47:39.593355 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8\": container with ID starting with 9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8 not found: ID does not exist" containerID="9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.593378 4759 pod_container_deletor.go:53] "DeleteContainer returned error" 
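The RemoveContainer/ContainerStatus exchanges around here are benign: scope.go removes each old ceilometer container, a follow-up status lookup gets gRPC NotFound because the container is already gone, and the kubelet logs the error and moves on. Callers of a CRI runtime typically fold NotFound into success when deleting; a sketch using the standard gRPC status helpers (deleteIgnoringNotFound is hypothetical):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// deleteIgnoringNotFound treats "already gone" as success, the way the
// kubelet tolerates the NotFound responses logged here. remove stands in
// for any CRI-style delete call.
func deleteIgnoringNotFound(remove func() error) error {
	err := remove()
	if err == nil || status.Code(err) == codes.NotFound {
		return nil // idempotent delete
	}
	return err
}

func main() {
	gone := func() error {
		return status.Error(codes.NotFound, "container not found: ID does not exist")
	}
	fmt.Println(deleteIgnoringNotFound(gone)) // <nil>
}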
containerID={"Type":"cri-o","ID":"9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8"} err="failed to get container status \"9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8\": rpc error: code = NotFound desc = could not find container \"9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8\": container with ID starting with 9ac434afcfdc927104d06aa9a05301b9aca784f807478a7e65e64e7088c415c8 not found: ID does not exist" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.636390 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-run-httpd\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.636429 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4pdx\" (UniqueName: \"kubernetes.io/projected/9435388e-3549-4f3f-b9d8-6889a0594255-kube-api-access-k4pdx\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.636453 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.636479 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-config-data\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.636513 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-log-httpd\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.636533 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.636550 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.636585 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-scripts\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.738360 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-run-httpd\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.738835 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4pdx\" (UniqueName: \"kubernetes.io/projected/9435388e-3549-4f3f-b9d8-6889a0594255-kube-api-access-k4pdx\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.738791 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-run-httpd\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.738905 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.738931 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-config-data\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.739336 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-log-httpd\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.739363 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.739382 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.739422 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-scripts\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.739754 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-log-httpd\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.742725 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.743804 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-scripts\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.745752 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.746093 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-config-data\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.747934 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.756882 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4pdx\" (UniqueName: \"kubernetes.io/projected/9435388e-3549-4f3f-b9d8-6889a0594255-kube-api-access-k4pdx\") pod \"ceilometer-0\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " pod="openstack/ceilometer-0" Dec 05 00:47:39 crc kubenswrapper[4759]: I1205 00:47:39.836016 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:40 crc kubenswrapper[4759]: I1205 00:47:40.334440 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:40 crc kubenswrapper[4759]: W1205 00:47:40.339525 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9435388e_3549_4f3f_b9d8_6889a0594255.slice/crio-b397b6b19499ed0b94c87e234f8f092e0f86c3612eb446379459ea3ec479d613 WatchSource:0}: Error finding container b397b6b19499ed0b94c87e234f8f092e0f86c3612eb446379459ea3ec479d613: Status 404 returned error can't find the container with id b397b6b19499ed0b94c87e234f8f092e0f86c3612eb446379459ea3ec479d613 Dec 05 00:47:40 crc kubenswrapper[4759]: I1205 00:47:40.404120 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9435388e-3549-4f3f-b9d8-6889a0594255","Type":"ContainerStarted","Data":"b397b6b19499ed0b94c87e234f8f092e0f86c3612eb446379459ea3ec479d613"} Dec 05 00:47:41 crc kubenswrapper[4759]: I1205 00:47:41.177376 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629" path="/var/lib/kubelet/pods/1f7c1e1c-cd9d-4cd8-965c-8dd37ad8e629/volumes" Dec 05 00:47:41 crc kubenswrapper[4759]: I1205 00:47:41.418116 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9435388e-3549-4f3f-b9d8-6889a0594255","Type":"ContainerStarted","Data":"8cc927ad357f792dd33c54e2a2ae8a00a92cb50b3b0486e79946bee5b5fdd65a"} Dec 05 00:47:42 crc kubenswrapper[4759]: I1205 00:47:42.128118 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:42 crc kubenswrapper[4759]: I1205 00:47:42.429524 4759 generic.go:334] "Generic (PLEG): container finished" podID="3fa7030d-e94e-4590-900a-8d5d41398af8" containerID="186415a7ed2030aad90059c2d1e7bd3ce30f31ebfad7a40ddde9f8d3881cb4d5" exitCode=0 Dec 05 00:47:42 crc kubenswrapper[4759]: I1205 00:47:42.429699 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54dff77467-gvp7s" event={"ID":"3fa7030d-e94e-4590-900a-8d5d41398af8","Type":"ContainerDied","Data":"186415a7ed2030aad90059c2d1e7bd3ce30f31ebfad7a40ddde9f8d3881cb4d5"} Dec 05 00:47:42 crc kubenswrapper[4759]: I1205 00:47:42.432892 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9435388e-3549-4f3f-b9d8-6889a0594255","Type":"ContainerStarted","Data":"846c8280b18b24f6b80d5c9b6e6618f62a8869787efbeab2a0c59341823b0cdd"} Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.507345 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54dff77467-gvp7s" event={"ID":"3fa7030d-e94e-4590-900a-8d5d41398af8","Type":"ContainerDied","Data":"836354ab0de0e580c3714555344defc0b597a99581b63a1873d65d0db13e9179"} Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.507705 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="836354ab0de0e580c3714555344defc0b597a99581b63a1873d65d0db13e9179" Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.536160 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.621668 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data-custom\") pod \"3fa7030d-e94e-4590-900a-8d5d41398af8\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.621805 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data\") pod \"3fa7030d-e94e-4590-900a-8d5d41398af8\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.622170 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-combined-ca-bundle\") pod \"3fa7030d-e94e-4590-900a-8d5d41398af8\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.622222 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxx2c\" (UniqueName: \"kubernetes.io/projected/3fa7030d-e94e-4590-900a-8d5d41398af8-kube-api-access-mxx2c\") pod \"3fa7030d-e94e-4590-900a-8d5d41398af8\" (UID: \"3fa7030d-e94e-4590-900a-8d5d41398af8\") " Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.626832 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa7030d-e94e-4590-900a-8d5d41398af8-kube-api-access-mxx2c" (OuterVolumeSpecName: "kube-api-access-mxx2c") pod "3fa7030d-e94e-4590-900a-8d5d41398af8" (UID: "3fa7030d-e94e-4590-900a-8d5d41398af8"). InnerVolumeSpecName "kube-api-access-mxx2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.626849 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3fa7030d-e94e-4590-900a-8d5d41398af8" (UID: "3fa7030d-e94e-4590-900a-8d5d41398af8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.658832 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fa7030d-e94e-4590-900a-8d5d41398af8" (UID: "3fa7030d-e94e-4590-900a-8d5d41398af8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.670741 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data" (OuterVolumeSpecName: "config-data") pod "3fa7030d-e94e-4590-900a-8d5d41398af8" (UID: "3fa7030d-e94e-4590-900a-8d5d41398af8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.724217 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.724258 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.724271 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa7030d-e94e-4590-900a-8d5d41398af8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:47 crc kubenswrapper[4759]: I1205 00:47:47.724284 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxx2c\" (UniqueName: \"kubernetes.io/projected/3fa7030d-e94e-4590-900a-8d5d41398af8-kube-api-access-mxx2c\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:48 crc kubenswrapper[4759]: I1205 00:47:48.522143 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9435388e-3549-4f3f-b9d8-6889a0594255","Type":"ContainerStarted","Data":"09bb029dd4380c2c9d180ab80101f7c297b642fa58a3784851685e001931206b"} Dec 05 00:47:48 crc kubenswrapper[4759]: I1205 00:47:48.523648 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-54dff77467-gvp7s" Dec 05 00:47:48 crc kubenswrapper[4759]: I1205 00:47:48.527804 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l4fmq" event={"ID":"ee78c1b8-dbc7-496b-b757-4433f2764e67","Type":"ContainerStarted","Data":"889920e7c17f4f567f1b7c953223266e6fadf4eebc3eed05977ac09c646d6f2e"} Dec 05 00:47:48 crc kubenswrapper[4759]: I1205 00:47:48.566632 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-l4fmq" podStartSLOduration=3.032859225 podStartE2EDuration="11.566610284s" podCreationTimestamp="2025-12-05 00:47:37 +0000 UTC" firstStartedPulling="2025-12-05 00:47:38.819817962 +0000 UTC m=+1478.035478902" lastFinishedPulling="2025-12-05 00:47:47.353569011 +0000 UTC m=+1486.569229961" observedRunningTime="2025-12-05 00:47:48.548092512 +0000 UTC m=+1487.763753462" watchObservedRunningTime="2025-12-05 00:47:48.566610284 +0000 UTC m=+1487.782271244" Dec 05 00:47:48 crc kubenswrapper[4759]: I1205 00:47:48.579594 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-54dff77467-gvp7s"] Dec 05 00:47:48 crc kubenswrapper[4759]: I1205 00:47:48.587736 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-54dff77467-gvp7s"] Dec 05 00:47:49 crc kubenswrapper[4759]: I1205 00:47:49.172255 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fa7030d-e94e-4590-900a-8d5d41398af8" path="/var/lib/kubelet/pods/3fa7030d-e94e-4590-900a-8d5d41398af8/volumes" Dec 05 00:47:50 crc kubenswrapper[4759]: I1205 00:47:50.549498 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9435388e-3549-4f3f-b9d8-6889a0594255","Type":"ContainerStarted","Data":"478b8f4cfdb3ba3bff552bcac55b5f0853cf065f49396c06b441c33b22a5efb6"} Dec 05 00:47:50 crc kubenswrapper[4759]: I1205 00:47:50.549703 4759 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="ceilometer-central-agent" containerID="cri-o://8cc927ad357f792dd33c54e2a2ae8a00a92cb50b3b0486e79946bee5b5fdd65a" gracePeriod=30 Dec 05 00:47:50 crc kubenswrapper[4759]: I1205 00:47:50.549776 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="ceilometer-notification-agent" containerID="cri-o://846c8280b18b24f6b80d5c9b6e6618f62a8869787efbeab2a0c59341823b0cdd" gracePeriod=30 Dec 05 00:47:50 crc kubenswrapper[4759]: I1205 00:47:50.549994 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 00:47:50 crc kubenswrapper[4759]: I1205 00:47:50.549781 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="sg-core" containerID="cri-o://09bb029dd4380c2c9d180ab80101f7c297b642fa58a3784851685e001931206b" gracePeriod=30 Dec 05 00:47:50 crc kubenswrapper[4759]: I1205 00:47:50.550172 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="proxy-httpd" containerID="cri-o://478b8f4cfdb3ba3bff552bcac55b5f0853cf065f49396c06b441c33b22a5efb6" gracePeriod=30 Dec 05 00:47:50 crc kubenswrapper[4759]: I1205 00:47:50.585078 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8445479479999998 podStartE2EDuration="11.585059936s" podCreationTimestamp="2025-12-05 00:47:39 +0000 UTC" firstStartedPulling="2025-12-05 00:47:40.341362229 +0000 UTC m=+1479.557023179" lastFinishedPulling="2025-12-05 00:47:49.081874217 +0000 UTC m=+1488.297535167" observedRunningTime="2025-12-05 00:47:50.577535051 +0000 UTC m=+1489.793196001" watchObservedRunningTime="2025-12-05 00:47:50.585059936 +0000 UTC m=+1489.800720886" Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.565035 4759 generic.go:334] "Generic (PLEG): container finished" podID="9435388e-3549-4f3f-b9d8-6889a0594255" containerID="478b8f4cfdb3ba3bff552bcac55b5f0853cf065f49396c06b441c33b22a5efb6" exitCode=0 Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.565333 4759 generic.go:334] "Generic (PLEG): container finished" podID="9435388e-3549-4f3f-b9d8-6889a0594255" containerID="09bb029dd4380c2c9d180ab80101f7c297b642fa58a3784851685e001931206b" exitCode=2 Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.565344 4759 generic.go:334] "Generic (PLEG): container finished" podID="9435388e-3549-4f3f-b9d8-6889a0594255" containerID="846c8280b18b24f6b80d5c9b6e6618f62a8869787efbeab2a0c59341823b0cdd" exitCode=0 Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.565355 4759 generic.go:334] "Generic (PLEG): container finished" podID="9435388e-3549-4f3f-b9d8-6889a0594255" containerID="8cc927ad357f792dd33c54e2a2ae8a00a92cb50b3b0486e79946bee5b5fdd65a" exitCode=0 Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.565379 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9435388e-3549-4f3f-b9d8-6889a0594255","Type":"ContainerDied","Data":"478b8f4cfdb3ba3bff552bcac55b5f0853cf065f49396c06b441c33b22a5efb6"} Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.565409 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"9435388e-3549-4f3f-b9d8-6889a0594255","Type":"ContainerDied","Data":"09bb029dd4380c2c9d180ab80101f7c297b642fa58a3784851685e001931206b"} Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.565421 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9435388e-3549-4f3f-b9d8-6889a0594255","Type":"ContainerDied","Data":"846c8280b18b24f6b80d5c9b6e6618f62a8869787efbeab2a0c59341823b0cdd"} Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.565433 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9435388e-3549-4f3f-b9d8-6889a0594255","Type":"ContainerDied","Data":"8cc927ad357f792dd33c54e2a2ae8a00a92cb50b3b0486e79946bee5b5fdd65a"} Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.784102 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.914645 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-sg-core-conf-yaml\") pod \"9435388e-3549-4f3f-b9d8-6889a0594255\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.914728 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-run-httpd\") pod \"9435388e-3549-4f3f-b9d8-6889a0594255\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.914801 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-ceilometer-tls-certs\") pod \"9435388e-3549-4f3f-b9d8-6889a0594255\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.914860 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-combined-ca-bundle\") pod \"9435388e-3549-4f3f-b9d8-6889a0594255\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.914896 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-config-data\") pod \"9435388e-3549-4f3f-b9d8-6889a0594255\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.914963 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-scripts\") pod \"9435388e-3549-4f3f-b9d8-6889a0594255\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.914990 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-log-httpd\") pod \"9435388e-3549-4f3f-b9d8-6889a0594255\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.915038 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-k4pdx\" (UniqueName: \"kubernetes.io/projected/9435388e-3549-4f3f-b9d8-6889a0594255-kube-api-access-k4pdx\") pod \"9435388e-3549-4f3f-b9d8-6889a0594255\" (UID: \"9435388e-3549-4f3f-b9d8-6889a0594255\") " Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.915853 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9435388e-3549-4f3f-b9d8-6889a0594255" (UID: "9435388e-3549-4f3f-b9d8-6889a0594255"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.915973 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9435388e-3549-4f3f-b9d8-6889a0594255" (UID: "9435388e-3549-4f3f-b9d8-6889a0594255"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.921905 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-scripts" (OuterVolumeSpecName: "scripts") pod "9435388e-3549-4f3f-b9d8-6889a0594255" (UID: "9435388e-3549-4f3f-b9d8-6889a0594255"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.934628 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9435388e-3549-4f3f-b9d8-6889a0594255-kube-api-access-k4pdx" (OuterVolumeSpecName: "kube-api-access-k4pdx") pod "9435388e-3549-4f3f-b9d8-6889a0594255" (UID: "9435388e-3549-4f3f-b9d8-6889a0594255"). InnerVolumeSpecName "kube-api-access-k4pdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.949746 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9435388e-3549-4f3f-b9d8-6889a0594255" (UID: "9435388e-3549-4f3f-b9d8-6889a0594255"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:51 crc kubenswrapper[4759]: I1205 00:47:51.995075 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9435388e-3549-4f3f-b9d8-6889a0594255" (UID: "9435388e-3549-4f3f-b9d8-6889a0594255"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.014489 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9435388e-3549-4f3f-b9d8-6889a0594255" (UID: "9435388e-3549-4f3f-b9d8-6889a0594255"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.018402 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.018434 4759 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.018463 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4pdx\" (UniqueName: \"kubernetes.io/projected/9435388e-3549-4f3f-b9d8-6889a0594255-kube-api-access-k4pdx\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.018475 4759 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.018483 4759 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9435388e-3549-4f3f-b9d8-6889a0594255-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.018753 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.018776 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.086040 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-config-data" (OuterVolumeSpecName: "config-data") pod "9435388e-3549-4f3f-b9d8-6889a0594255" (UID: "9435388e-3549-4f3f-b9d8-6889a0594255"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.120328 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9435388e-3549-4f3f-b9d8-6889a0594255-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.580988 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9435388e-3549-4f3f-b9d8-6889a0594255","Type":"ContainerDied","Data":"b397b6b19499ed0b94c87e234f8f092e0f86c3612eb446379459ea3ec479d613"} Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.581044 4759 scope.go:117] "RemoveContainer" containerID="478b8f4cfdb3ba3bff552bcac55b5f0853cf065f49396c06b441c33b22a5efb6" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.581147 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.604202 4759 scope.go:117] "RemoveContainer" containerID="09bb029dd4380c2c9d180ab80101f7c297b642fa58a3784851685e001931206b" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.630490 4759 scope.go:117] "RemoveContainer" containerID="846c8280b18b24f6b80d5c9b6e6618f62a8869787efbeab2a0c59341823b0cdd" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.635123 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.658881 4759 scope.go:117] "RemoveContainer" containerID="8cc927ad357f792dd33c54e2a2ae8a00a92cb50b3b0486e79946bee5b5fdd65a" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.664152 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.679742 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:52 crc kubenswrapper[4759]: E1205 00:47:52.680194 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa7030d-e94e-4590-900a-8d5d41398af8" containerName="heat-engine" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.680212 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa7030d-e94e-4590-900a-8d5d41398af8" containerName="heat-engine" Dec 05 00:47:52 crc kubenswrapper[4759]: E1205 00:47:52.680222 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="ceilometer-notification-agent" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.680229 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="ceilometer-notification-agent" Dec 05 00:47:52 crc kubenswrapper[4759]: E1205 00:47:52.680258 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="sg-core" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.680264 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="sg-core" Dec 05 00:47:52 crc kubenswrapper[4759]: E1205 00:47:52.680278 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="proxy-httpd" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.680284 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="proxy-httpd" Dec 05 00:47:52 crc kubenswrapper[4759]: E1205 00:47:52.680297 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="ceilometer-central-agent" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.680315 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="ceilometer-central-agent" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.680529 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="sg-core" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.680542 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="proxy-httpd" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.680551 4759 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3fa7030d-e94e-4590-900a-8d5d41398af8" containerName="heat-engine" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.680566 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="ceilometer-notification-agent" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.680581 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" containerName="ceilometer-central-agent" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.682387 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.686632 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.686686 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.686733 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.719257 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.852455 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-run-httpd\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.852534 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-log-httpd\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.852555 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.852672 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.852714 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-config-data\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.852733 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5c8w\" (UniqueName: \"kubernetes.io/projected/06635c0f-08c9-486e-95bc-5170cbcd00be-kube-api-access-h5c8w\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " 
pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.852868 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.852905 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-scripts\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.954998 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.955053 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-config-data\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.955072 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5c8w\" (UniqueName: \"kubernetes.io/projected/06635c0f-08c9-486e-95bc-5170cbcd00be-kube-api-access-h5c8w\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.955105 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.955123 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-scripts\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.955171 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-run-httpd\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.955214 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-log-httpd\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.955275 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.955625 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-run-httpd\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.955659 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-log-httpd\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.959828 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.959891 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.960974 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-config-data\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.968907 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-scripts\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.977406 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:52 crc kubenswrapper[4759]: I1205 00:47:52.977573 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5c8w\" (UniqueName: \"kubernetes.io/projected/06635c0f-08c9-486e-95bc-5170cbcd00be-kube-api-access-h5c8w\") pod \"ceilometer-0\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " pod="openstack/ceilometer-0" Dec 05 00:47:53 crc kubenswrapper[4759]: I1205 00:47:53.065502 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:47:53 crc kubenswrapper[4759]: I1205 00:47:53.166089 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9435388e-3549-4f3f-b9d8-6889a0594255" path="/var/lib/kubelet/pods/9435388e-3549-4f3f-b9d8-6889a0594255/volumes" Dec 05 00:47:53 crc kubenswrapper[4759]: I1205 00:47:53.560265 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:53 crc kubenswrapper[4759]: W1205 00:47:53.588744 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06635c0f_08c9_486e_95bc_5170cbcd00be.slice/crio-8a9abda287501192081315bb37c3ef242a832d2e84c0b55fa816d97db2105a36 WatchSource:0}: Error finding container 8a9abda287501192081315bb37c3ef242a832d2e84c0b55fa816d97db2105a36: Status 404 returned error can't find the container with id 8a9abda287501192081315bb37c3ef242a832d2e84c0b55fa816d97db2105a36 Dec 05 00:47:53 crc kubenswrapper[4759]: I1205 00:47:53.590004 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:47:54 crc kubenswrapper[4759]: I1205 00:47:54.606398 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06635c0f-08c9-486e-95bc-5170cbcd00be","Type":"ContainerStarted","Data":"b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011"} Dec 05 00:47:54 crc kubenswrapper[4759]: I1205 00:47:54.606780 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06635c0f-08c9-486e-95bc-5170cbcd00be","Type":"ContainerStarted","Data":"8a9abda287501192081315bb37c3ef242a832d2e84c0b55fa816d97db2105a36"} Dec 05 00:47:56 crc kubenswrapper[4759]: I1205 00:47:56.645722 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06635c0f-08c9-486e-95bc-5170cbcd00be","Type":"ContainerStarted","Data":"7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764"} Dec 05 00:47:56 crc kubenswrapper[4759]: I1205 00:47:56.646037 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06635c0f-08c9-486e-95bc-5170cbcd00be","Type":"ContainerStarted","Data":"134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c"} Dec 05 00:47:57 crc kubenswrapper[4759]: I1205 00:47:57.655941 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06635c0f-08c9-486e-95bc-5170cbcd00be","Type":"ContainerStarted","Data":"e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76"} Dec 05 00:47:57 crc kubenswrapper[4759]: I1205 00:47:57.656566 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="ceilometer-central-agent" containerID="cri-o://b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011" gracePeriod=30 Dec 05 00:47:57 crc kubenswrapper[4759]: I1205 00:47:57.656799 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 00:47:57 crc kubenswrapper[4759]: I1205 00:47:57.657078 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="proxy-httpd" containerID="cri-o://e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76" gracePeriod=30 Dec 05 00:47:57 crc kubenswrapper[4759]: I1205 
00:47:57.657119 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="sg-core" containerID="cri-o://7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764" gracePeriod=30 Dec 05 00:47:57 crc kubenswrapper[4759]: I1205 00:47:57.657149 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="ceilometer-notification-agent" containerID="cri-o://134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c" gracePeriod=30 Dec 05 00:47:57 crc kubenswrapper[4759]: I1205 00:47:57.684141 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.949034516 podStartE2EDuration="5.684123999s" podCreationTimestamp="2025-12-05 00:47:52 +0000 UTC" firstStartedPulling="2025-12-05 00:47:53.592287563 +0000 UTC m=+1492.807948523" lastFinishedPulling="2025-12-05 00:47:57.327377056 +0000 UTC m=+1496.543038006" observedRunningTime="2025-12-05 00:47:57.678261655 +0000 UTC m=+1496.893922605" watchObservedRunningTime="2025-12-05 00:47:57.684123999 +0000 UTC m=+1496.899784949" Dec 05 00:47:58 crc kubenswrapper[4759]: I1205 00:47:58.669666 4759 generic.go:334] "Generic (PLEG): container finished" podID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerID="e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76" exitCode=0 Dec 05 00:47:58 crc kubenswrapper[4759]: I1205 00:47:58.671002 4759 generic.go:334] "Generic (PLEG): container finished" podID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerID="7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764" exitCode=2 Dec 05 00:47:58 crc kubenswrapper[4759]: I1205 00:47:58.671103 4759 generic.go:334] "Generic (PLEG): container finished" podID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerID="134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c" exitCode=0 Dec 05 00:47:58 crc kubenswrapper[4759]: I1205 00:47:58.669780 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06635c0f-08c9-486e-95bc-5170cbcd00be","Type":"ContainerDied","Data":"e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76"} Dec 05 00:47:58 crc kubenswrapper[4759]: I1205 00:47:58.671351 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06635c0f-08c9-486e-95bc-5170cbcd00be","Type":"ContainerDied","Data":"7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764"} Dec 05 00:47:58 crc kubenswrapper[4759]: I1205 00:47:58.671455 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06635c0f-08c9-486e-95bc-5170cbcd00be","Type":"ContainerDied","Data":"134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c"} Dec 05 00:47:59 crc kubenswrapper[4759]: I1205 00:47:59.683723 4759 generic.go:334] "Generic (PLEG): container finished" podID="ee78c1b8-dbc7-496b-b757-4433f2764e67" containerID="889920e7c17f4f567f1b7c953223266e6fadf4eebc3eed05977ac09c646d6f2e" exitCode=0 Dec 05 00:47:59 crc kubenswrapper[4759]: I1205 00:47:59.683784 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l4fmq" event={"ID":"ee78c1b8-dbc7-496b-b757-4433f2764e67","Type":"ContainerDied","Data":"889920e7c17f4f567f1b7c953223266e6fadf4eebc3eed05977ac09c646d6f2e"} Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 
00:48:01.141864 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.223203 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-config-data\") pod \"ee78c1b8-dbc7-496b-b757-4433f2764e67\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.223289 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-scripts\") pod \"ee78c1b8-dbc7-496b-b757-4433f2764e67\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.223474 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-combined-ca-bundle\") pod \"ee78c1b8-dbc7-496b-b757-4433f2764e67\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.223589 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n7qq\" (UniqueName: \"kubernetes.io/projected/ee78c1b8-dbc7-496b-b757-4433f2764e67-kube-api-access-4n7qq\") pod \"ee78c1b8-dbc7-496b-b757-4433f2764e67\" (UID: \"ee78c1b8-dbc7-496b-b757-4433f2764e67\") " Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.230260 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-scripts" (OuterVolumeSpecName: "scripts") pod "ee78c1b8-dbc7-496b-b757-4433f2764e67" (UID: "ee78c1b8-dbc7-496b-b757-4433f2764e67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.245877 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee78c1b8-dbc7-496b-b757-4433f2764e67-kube-api-access-4n7qq" (OuterVolumeSpecName: "kube-api-access-4n7qq") pod "ee78c1b8-dbc7-496b-b757-4433f2764e67" (UID: "ee78c1b8-dbc7-496b-b757-4433f2764e67"). InnerVolumeSpecName "kube-api-access-4n7qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.259535 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee78c1b8-dbc7-496b-b757-4433f2764e67" (UID: "ee78c1b8-dbc7-496b-b757-4433f2764e67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.273477 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-config-data" (OuterVolumeSpecName: "config-data") pod "ee78c1b8-dbc7-496b-b757-4433f2764e67" (UID: "ee78c1b8-dbc7-496b-b757-4433f2764e67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.326024 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.326067 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n7qq\" (UniqueName: \"kubernetes.io/projected/ee78c1b8-dbc7-496b-b757-4433f2764e67-kube-api-access-4n7qq\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.326082 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.326094 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee78c1b8-dbc7-496b-b757-4433f2764e67-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.706690 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l4fmq" event={"ID":"ee78c1b8-dbc7-496b-b757-4433f2764e67","Type":"ContainerDied","Data":"0fa954f2bd0e61334a5e062469f3788fa1497e9e3b087c99da22c6208e8f7f2b"} Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.707026 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fa954f2bd0e61334a5e062469f3788fa1497e9e3b087c99da22c6208e8f7f2b" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.706785 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l4fmq" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.863766 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 00:48:01 crc kubenswrapper[4759]: E1205 00:48:01.864348 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee78c1b8-dbc7-496b-b757-4433f2764e67" containerName="nova-cell0-conductor-db-sync" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.864371 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee78c1b8-dbc7-496b-b757-4433f2764e67" containerName="nova-cell0-conductor-db-sync" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.864734 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee78c1b8-dbc7-496b-b757-4433f2764e67" containerName="nova-cell0-conductor-db-sync" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.865642 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.867842 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rqh9c" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.869356 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.879239 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.942372 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed8d71f-0481-4f8c-aed6-972efd952e3b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5ed8d71f-0481-4f8c-aed6-972efd952e3b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.942484 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed8d71f-0481-4f8c-aed6-972efd952e3b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5ed8d71f-0481-4f8c-aed6-972efd952e3b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:01 crc kubenswrapper[4759]: I1205 00:48:01.942560 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c75xx\" (UniqueName: \"kubernetes.io/projected/5ed8d71f-0481-4f8c-aed6-972efd952e3b-kube-api-access-c75xx\") pod \"nova-cell0-conductor-0\" (UID: \"5ed8d71f-0481-4f8c-aed6-972efd952e3b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:02 crc kubenswrapper[4759]: I1205 00:48:02.044120 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed8d71f-0481-4f8c-aed6-972efd952e3b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5ed8d71f-0481-4f8c-aed6-972efd952e3b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:02 crc kubenswrapper[4759]: I1205 00:48:02.044266 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed8d71f-0481-4f8c-aed6-972efd952e3b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5ed8d71f-0481-4f8c-aed6-972efd952e3b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:02 crc kubenswrapper[4759]: I1205 00:48:02.044368 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c75xx\" (UniqueName: \"kubernetes.io/projected/5ed8d71f-0481-4f8c-aed6-972efd952e3b-kube-api-access-c75xx\") pod \"nova-cell0-conductor-0\" (UID: \"5ed8d71f-0481-4f8c-aed6-972efd952e3b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:02 crc kubenswrapper[4759]: I1205 00:48:02.049153 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed8d71f-0481-4f8c-aed6-972efd952e3b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5ed8d71f-0481-4f8c-aed6-972efd952e3b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:02 crc kubenswrapper[4759]: I1205 00:48:02.053131 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed8d71f-0481-4f8c-aed6-972efd952e3b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"5ed8d71f-0481-4f8c-aed6-972efd952e3b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:02 crc kubenswrapper[4759]: I1205 00:48:02.064429 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c75xx\" (UniqueName: \"kubernetes.io/projected/5ed8d71f-0481-4f8c-aed6-972efd952e3b-kube-api-access-c75xx\") pod \"nova-cell0-conductor-0\" (UID: \"5ed8d71f-0481-4f8c-aed6-972efd952e3b\") " pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:02 crc kubenswrapper[4759]: I1205 00:48:02.198213 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:02 crc kubenswrapper[4759]: I1205 00:48:02.705823 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.613594 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.676610 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-ceilometer-tls-certs\") pod \"06635c0f-08c9-486e-95bc-5170cbcd00be\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.676662 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-config-data\") pod \"06635c0f-08c9-486e-95bc-5170cbcd00be\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.676743 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-run-httpd\") pod \"06635c0f-08c9-486e-95bc-5170cbcd00be\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.676763 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-scripts\") pod \"06635c0f-08c9-486e-95bc-5170cbcd00be\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.677072 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "06635c0f-08c9-486e-95bc-5170cbcd00be" (UID: "06635c0f-08c9-486e-95bc-5170cbcd00be"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.677162 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-combined-ca-bundle\") pod \"06635c0f-08c9-486e-95bc-5170cbcd00be\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.677198 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-sg-core-conf-yaml\") pod \"06635c0f-08c9-486e-95bc-5170cbcd00be\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.677537 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-log-httpd\") pod \"06635c0f-08c9-486e-95bc-5170cbcd00be\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.677577 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5c8w\" (UniqueName: \"kubernetes.io/projected/06635c0f-08c9-486e-95bc-5170cbcd00be-kube-api-access-h5c8w\") pod \"06635c0f-08c9-486e-95bc-5170cbcd00be\" (UID: \"06635c0f-08c9-486e-95bc-5170cbcd00be\") " Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.678099 4759 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.678155 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "06635c0f-08c9-486e-95bc-5170cbcd00be" (UID: "06635c0f-08c9-486e-95bc-5170cbcd00be"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.698500 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-scripts" (OuterVolumeSpecName: "scripts") pod "06635c0f-08c9-486e-95bc-5170cbcd00be" (UID: "06635c0f-08c9-486e-95bc-5170cbcd00be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.701484 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06635c0f-08c9-486e-95bc-5170cbcd00be-kube-api-access-h5c8w" (OuterVolumeSpecName: "kube-api-access-h5c8w") pod "06635c0f-08c9-486e-95bc-5170cbcd00be" (UID: "06635c0f-08c9-486e-95bc-5170cbcd00be"). InnerVolumeSpecName "kube-api-access-h5c8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.703950 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "06635c0f-08c9-486e-95bc-5170cbcd00be" (UID: "06635c0f-08c9-486e-95bc-5170cbcd00be"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.736475 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5ed8d71f-0481-4f8c-aed6-972efd952e3b","Type":"ContainerStarted","Data":"e57fd92254a5df50c87fd0b1044d870cf341005c337ba2921f1cccb7bbbbc280"} Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.736521 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5ed8d71f-0481-4f8c-aed6-972efd952e3b","Type":"ContainerStarted","Data":"89261d4dc900a96746fecc3ef6ef168afb4d7b5dad1f0ba5fe58017ba57bfe28"} Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.736555 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.738876 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "06635c0f-08c9-486e-95bc-5170cbcd00be" (UID: "06635c0f-08c9-486e-95bc-5170cbcd00be"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.741788 4759 generic.go:334] "Generic (PLEG): container finished" podID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerID="b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011" exitCode=0 Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.741887 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06635c0f-08c9-486e-95bc-5170cbcd00be","Type":"ContainerDied","Data":"b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011"} Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.741960 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06635c0f-08c9-486e-95bc-5170cbcd00be","Type":"ContainerDied","Data":"8a9abda287501192081315bb37c3ef242a832d2e84c0b55fa816d97db2105a36"} Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.742047 4759 scope.go:117] "RemoveContainer" containerID="e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.742206 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.760525 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.760509437 podStartE2EDuration="2.760509437s" podCreationTimestamp="2025-12-05 00:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:03.752765217 +0000 UTC m=+1502.968426167" watchObservedRunningTime="2025-12-05 00:48:03.760509437 +0000 UTC m=+1502.976170387" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.766952 4759 scope.go:117] "RemoveContainer" containerID="7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.779614 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.779640 4759 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.779650 4759 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06635c0f-08c9-486e-95bc-5170cbcd00be-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.779659 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5c8w\" (UniqueName: \"kubernetes.io/projected/06635c0f-08c9-486e-95bc-5170cbcd00be-kube-api-access-h5c8w\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.779669 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.786511 4759 scope.go:117] "RemoveContainer" containerID="134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.800241 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06635c0f-08c9-486e-95bc-5170cbcd00be" (UID: "06635c0f-08c9-486e-95bc-5170cbcd00be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.804617 4759 scope.go:117] "RemoveContainer" containerID="b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.810816 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-config-data" (OuterVolumeSpecName: "config-data") pod "06635c0f-08c9-486e-95bc-5170cbcd00be" (UID: "06635c0f-08c9-486e-95bc-5170cbcd00be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.825623 4759 scope.go:117] "RemoveContainer" containerID="e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76" Dec 05 00:48:03 crc kubenswrapper[4759]: E1205 00:48:03.826038 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76\": container with ID starting with e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76 not found: ID does not exist" containerID="e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.826148 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76"} err="failed to get container status \"e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76\": rpc error: code = NotFound desc = could not find container \"e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76\": container with ID starting with e6b6445ed7cbd1d900857fbce1cdcfe936ed650568d8eef0b09c0a958739dd76 not found: ID does not exist" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.826239 4759 scope.go:117] "RemoveContainer" containerID="7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764" Dec 05 00:48:03 crc kubenswrapper[4759]: E1205 00:48:03.826698 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764\": container with ID starting with 7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764 not found: ID does not exist" containerID="7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.826774 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764"} err="failed to get container status \"7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764\": rpc error: code = NotFound desc = could not find container \"7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764\": container with ID starting with 7dc6f668ca96ef3962c6e1b9e305a7350e8b9e91bb01c818ac6b865a23f8b764 not found: ID does not exist" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.826907 4759 scope.go:117] "RemoveContainer" containerID="134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c" Dec 05 00:48:03 crc kubenswrapper[4759]: E1205 00:48:03.827258 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c\": container with ID starting with 134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c not found: ID does not exist" containerID="134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.827283 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c"} err="failed to get container status \"134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c\": rpc error: code = NotFound desc = could not 
find container \"134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c\": container with ID starting with 134ec4c168051f099f12b11815745998066425f000f18bb4a47e20b11d630b5c not found: ID does not exist" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.827299 4759 scope.go:117] "RemoveContainer" containerID="b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011" Dec 05 00:48:03 crc kubenswrapper[4759]: E1205 00:48:03.827720 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011\": container with ID starting with b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011 not found: ID does not exist" containerID="b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.827741 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011"} err="failed to get container status \"b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011\": rpc error: code = NotFound desc = could not find container \"b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011\": container with ID starting with b1bee61857a796d057d07b3249074f7730a9d012e613373871f25eb4157be011 not found: ID does not exist" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.881984 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:03 crc kubenswrapper[4759]: I1205 00:48:03.882017 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06635c0f-08c9-486e-95bc-5170cbcd00be-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.128668 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.144198 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.161007 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:04 crc kubenswrapper[4759]: E1205 00:48:04.161486 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="ceilometer-notification-agent" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.161506 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="ceilometer-notification-agent" Dec 05 00:48:04 crc kubenswrapper[4759]: E1205 00:48:04.161524 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="proxy-httpd" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.161531 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="proxy-httpd" Dec 05 00:48:04 crc kubenswrapper[4759]: E1205 00:48:04.161557 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="sg-core" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.161563 4759 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="sg-core" Dec 05 00:48:04 crc kubenswrapper[4759]: E1205 00:48:04.161578 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="ceilometer-central-agent" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.161584 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="ceilometer-central-agent" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.161761 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="ceilometer-notification-agent" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.161794 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="ceilometer-central-agent" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.161814 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="sg-core" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.161827 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" containerName="proxy-httpd" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.163579 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.167097 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.167545 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.167872 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.176122 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.289408 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-log-httpd\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.289497 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-run-httpd\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.289560 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.289588 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-scripts\") pod \"ceilometer-0\" (UID: 
\"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.289865 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-config-data\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.289922 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.289951 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.289974 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jslst\" (UniqueName: \"kubernetes.io/projected/095df254-d47f-41fe-bb00-806b9ce2d40d-kube-api-access-jslst\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.391251 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-run-httpd\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.391327 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.391353 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-scripts\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.391429 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-config-data\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.391457 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.391473 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.391497 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jslst\" (UniqueName: \"kubernetes.io/projected/095df254-d47f-41fe-bb00-806b9ce2d40d-kube-api-access-jslst\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.391540 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-log-httpd\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.391736 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-run-httpd\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.391938 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-log-httpd\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.396499 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-scripts\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.399844 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.406362 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-config-data\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.410210 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.410633 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.411349 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jslst\" (UniqueName: 
\"kubernetes.io/projected/095df254-d47f-41fe-bb00-806b9ce2d40d-kube-api-access-jslst\") pod \"ceilometer-0\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.433399 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.433464 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.433518 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.434534 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.434606 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" gracePeriod=600 Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.522804 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:04 crc kubenswrapper[4759]: E1205 00:48:04.555709 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.753096 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" exitCode=0 Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.753182 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1"} Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.753420 4759 scope.go:117] "RemoveContainer" containerID="398c049422bc0a204c0d315a98c453ec61f78c5e1ec3ea26a6b6394b0111caae" Dec 05 00:48:04 crc kubenswrapper[4759]: I1205 00:48:04.754064 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:48:04 crc kubenswrapper[4759]: E1205 00:48:04.754398 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:48:05 crc kubenswrapper[4759]: W1205 00:48:05.009280 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod095df254_d47f_41fe_bb00_806b9ce2d40d.slice/crio-8b24625ad183f5152084663378a424cbb16ba1f4c6eaa38668203d39123d5c1b WatchSource:0}: Error finding container 8b24625ad183f5152084663378a424cbb16ba1f4c6eaa38668203d39123d5c1b: Status 404 returned error can't find the container with id 8b24625ad183f5152084663378a424cbb16ba1f4c6eaa38668203d39123d5c1b Dec 05 00:48:05 crc kubenswrapper[4759]: I1205 00:48:05.012464 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:05 crc kubenswrapper[4759]: I1205 00:48:05.178365 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06635c0f-08c9-486e-95bc-5170cbcd00be" path="/var/lib/kubelet/pods/06635c0f-08c9-486e-95bc-5170cbcd00be/volumes" Dec 05 00:48:05 crc kubenswrapper[4759]: I1205 00:48:05.408128 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:05 crc kubenswrapper[4759]: I1205 00:48:05.772769 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095df254-d47f-41fe-bb00-806b9ce2d40d","Type":"ContainerStarted","Data":"4c93ec9f664ce26832935cc3a0e70d49806a8ec7d11470495fd1b9bf4c1e15a5"} Dec 05 00:48:05 crc kubenswrapper[4759]: I1205 00:48:05.773036 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"095df254-d47f-41fe-bb00-806b9ce2d40d","Type":"ContainerStarted","Data":"8b24625ad183f5152084663378a424cbb16ba1f4c6eaa38668203d39123d5c1b"} Dec 05 00:48:06 crc kubenswrapper[4759]: I1205 00:48:06.784423 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095df254-d47f-41fe-bb00-806b9ce2d40d","Type":"ContainerStarted","Data":"0ffeb111de94f467863c0ca42d4c6db5b58c288418ea7226fb0f8b1720a45ef3"} Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.245379 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.782805 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-bsbqv"] Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.784826 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.788755 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.788986 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.796497 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bsbqv"] Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.798913 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095df254-d47f-41fe-bb00-806b9ce2d40d","Type":"ContainerStarted","Data":"75b67ec57be4991279d4847e4b903680a5895959224a8fd1fd4e40b29319ade4"} Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.865433 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-config-data\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.865494 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdcwd\" (UniqueName: \"kubernetes.io/projected/15826122-66ee-4470-a806-6328f0fbb58c-kube-api-access-rdcwd\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.865519 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-scripts\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.865594 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.901739 4759 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-0"] Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.903844 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.912868 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.928113 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.985331 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdcwd\" (UniqueName: \"kubernetes.io/projected/15826122-66ee-4470-a806-6328f0fbb58c-kube-api-access-rdcwd\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.985394 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-scripts\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.985599 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:07 crc kubenswrapper[4759]: I1205 00:48:07.985861 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-config-data\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.021991 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdcwd\" (UniqueName: \"kubernetes.io/projected/15826122-66ee-4470-a806-6328f0fbb58c-kube-api-access-rdcwd\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.057968 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-config-data\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.073909 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-scripts\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.076234 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bsbqv\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " 
pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.097030 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlfqn\" (UniqueName: \"kubernetes.io/projected/17962c61-12f2-4ea8-b978-ecf42eebb429-kube-api-access-xlfqn\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.097272 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-config-data\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.097513 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.097602 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17962c61-12f2-4ea8-b978-ecf42eebb429-logs\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.108895 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.202506 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-t4ld2"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.204702 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-t4ld2" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.205943 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17962c61-12f2-4ea8-b978-ecf42eebb429-logs\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.207050 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlfqn\" (UniqueName: \"kubernetes.io/projected/17962c61-12f2-4ea8-b978-ecf42eebb429-kube-api-access-xlfqn\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.207529 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-config-data\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.213263 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.213101 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-config-data\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.206329 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17962c61-12f2-4ea8-b978-ecf42eebb429-logs\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.238696 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlfqn\" (UniqueName: \"kubernetes.io/projected/17962c61-12f2-4ea8-b978-ecf42eebb429-kube-api-access-xlfqn\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.254417 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.286664 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.301548 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.316798 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.316962 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26b2620d-e545-46ea-b97e-3fb934b5053c-operator-scripts\") pod \"aodh-db-create-t4ld2\" (UID: \"26b2620d-e545-46ea-b97e-3fb934b5053c\") " pod="openstack/aodh-db-create-t4ld2" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.317021 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gkq\" (UniqueName: \"kubernetes.io/projected/26b2620d-e545-46ea-b97e-3fb934b5053c-kube-api-access-c6gkq\") pod \"aodh-db-create-t4ld2\" (UID: \"26b2620d-e545-46ea-b97e-3fb934b5053c\") " pod="openstack/aodh-db-create-t4ld2" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.343442 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-t4ld2"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.359820 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.370153 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.372620 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.380078 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.381163 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.388644 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-6d43-account-create-update-78sjq"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.391193 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-6d43-account-create-update-78sjq" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.397515 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.398961 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.400544 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.402335 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.406390 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-6d43-account-create-update-78sjq"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.415473 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.418366 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26b2620d-e545-46ea-b97e-3fb934b5053c-operator-scripts\") pod \"aodh-db-create-t4ld2\" (UID: \"26b2620d-e545-46ea-b97e-3fb934b5053c\") " pod="openstack/aodh-db-create-t4ld2" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.418410 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gkq\" (UniqueName: \"kubernetes.io/projected/26b2620d-e545-46ea-b97e-3fb934b5053c-kube-api-access-c6gkq\") pod \"aodh-db-create-t4ld2\" (UID: \"26b2620d-e545-46ea-b97e-3fb934b5053c\") " pod="openstack/aodh-db-create-t4ld2" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.418491 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tsq5\" (UniqueName: \"kubernetes.io/projected/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-kube-api-access-2tsq5\") pod \"nova-scheduler-0\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.418516 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-config-data\") pod \"nova-scheduler-0\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.418571 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.419292 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26b2620d-e545-46ea-b97e-3fb934b5053c-operator-scripts\") pod \"aodh-db-create-t4ld2\" (UID: \"26b2620d-e545-46ea-b97e-3fb934b5053c\") " pod="openstack/aodh-db-create-t4ld2" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.429826 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7877d89589-b5m5r"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.436479 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.442410 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-b5m5r"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.448412 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gkq\" (UniqueName: \"kubernetes.io/projected/26b2620d-e545-46ea-b97e-3fb934b5053c-kube-api-access-c6gkq\") pod \"aodh-db-create-t4ld2\" (UID: \"26b2620d-e545-46ea-b97e-3fb934b5053c\") " pod="openstack/aodh-db-create-t4ld2" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520589 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520627 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpdd6\" (UniqueName: \"kubernetes.io/projected/26a46bc1-98c5-4362-93fe-9a2c140cb04d-kube-api-access-qpdd6\") pod \"aodh-6d43-account-create-update-78sjq\" (UID: \"26a46bc1-98c5-4362-93fe-9a2c140cb04d\") " pod="openstack/aodh-6d43-account-create-update-78sjq" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520651 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520668 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007ffadd-ee54-4e09-9ff3-e57beb40f0af-logs\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520701 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520738 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26a46bc1-98c5-4362-93fe-9a2c140cb04d-operator-scripts\") pod \"aodh-6d43-account-create-update-78sjq\" (UID: \"26a46bc1-98c5-4362-93fe-9a2c140cb04d\") " pod="openstack/aodh-6d43-account-create-update-78sjq" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520769 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm7hf\" (UniqueName: \"kubernetes.io/projected/007ffadd-ee54-4e09-9ff3-e57beb40f0af-kube-api-access-qm7hf\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520825 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520861 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-config-data\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520892 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tsq5\" (UniqueName: \"kubernetes.io/projected/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-kube-api-access-2tsq5\") pod \"nova-scheduler-0\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520910 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxsbt\" (UniqueName: \"kubernetes.io/projected/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-kube-api-access-fxsbt\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.520928 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-config-data\") pod \"nova-scheduler-0\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.529669 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-config-data\") pod \"nova-scheduler-0\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.538030 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.538532 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.548982 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tsq5\" (UniqueName: \"kubernetes.io/projected/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-kube-api-access-2tsq5\") pod \"nova-scheduler-0\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.596795 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-t4ld2" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.629848 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-config\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.629906 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-sb\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.629957 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26a46bc1-98c5-4362-93fe-9a2c140cb04d-operator-scripts\") pod \"aodh-6d43-account-create-update-78sjq\" (UID: \"26a46bc1-98c5-4362-93fe-9a2c140cb04d\") " pod="openstack/aodh-6d43-account-create-update-78sjq" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630000 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm7hf\" (UniqueName: \"kubernetes.io/projected/007ffadd-ee54-4e09-9ff3-e57beb40f0af-kube-api-access-qm7hf\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630032 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-svc\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630065 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgq5z\" (UniqueName: \"kubernetes.io/projected/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-kube-api-access-kgq5z\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630123 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630176 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-config-data\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630203 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-nb\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: 
\"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630238 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxsbt\" (UniqueName: \"kubernetes.io/projected/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-kube-api-access-fxsbt\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630256 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-swift-storage-0\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630293 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630333 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpdd6\" (UniqueName: \"kubernetes.io/projected/26a46bc1-98c5-4362-93fe-9a2c140cb04d-kube-api-access-qpdd6\") pod \"aodh-6d43-account-create-update-78sjq\" (UID: \"26a46bc1-98c5-4362-93fe-9a2c140cb04d\") " pod="openstack/aodh-6d43-account-create-update-78sjq" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630359 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630380 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007ffadd-ee54-4e09-9ff3-e57beb40f0af-logs\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.630838 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007ffadd-ee54-4e09-9ff3-e57beb40f0af-logs\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.632092 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26a46bc1-98c5-4362-93fe-9a2c140cb04d-operator-scripts\") pod \"aodh-6d43-account-create-update-78sjq\" (UID: \"26a46bc1-98c5-4362-93fe-9a2c140cb04d\") " pod="openstack/aodh-6d43-account-create-update-78sjq" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.634151 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-config-data\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.634965 4759 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.644043 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.658659 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.659531 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxsbt\" (UniqueName: \"kubernetes.io/projected/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-kube-api-access-fxsbt\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.659604 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm7hf\" (UniqueName: \"kubernetes.io/projected/007ffadd-ee54-4e09-9ff3-e57beb40f0af-kube-api-access-qm7hf\") pod \"nova-metadata-0\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.663221 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpdd6\" (UniqueName: \"kubernetes.io/projected/26a46bc1-98c5-4362-93fe-9a2c140cb04d-kube-api-access-qpdd6\") pod \"aodh-6d43-account-create-update-78sjq\" (UID: \"26a46bc1-98c5-4362-93fe-9a2c140cb04d\") " pod="openstack/aodh-6d43-account-create-update-78sjq" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.666045 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.726519 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.743401 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-config\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.743445 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-sb\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.743495 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-svc\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.743517 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgq5z\" (UniqueName: \"kubernetes.io/projected/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-kube-api-access-kgq5z\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.743584 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-nb\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.743612 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-swift-storage-0\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.744272 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-config\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.744391 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-swift-storage-0\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.744819 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-svc\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 
00:48:08.745110 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-sb\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.749716 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-nb\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.768450 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgq5z\" (UniqueName: \"kubernetes.io/projected/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-kube-api-access-kgq5z\") pod \"dnsmasq-dns-7877d89589-b5m5r\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.779848 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bsbqv"] Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.784065 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-6d43-account-create-update-78sjq" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.806354 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.816921 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.861501 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095df254-d47f-41fe-bb00-806b9ce2d40d","Type":"ContainerStarted","Data":"ce0b8baf60a46e17e0207b096ec20b59612a0a8644de870bce52aac881fd69a1"} Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.861840 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="ceilometer-central-agent" containerID="cri-o://4c93ec9f664ce26832935cc3a0e70d49806a8ec7d11470495fd1b9bf4c1e15a5" gracePeriod=30 Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.861875 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.861971 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="proxy-httpd" containerID="cri-o://ce0b8baf60a46e17e0207b096ec20b59612a0a8644de870bce52aac881fd69a1" gracePeriod=30 Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.862025 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="sg-core" containerID="cri-o://75b67ec57be4991279d4847e4b903680a5895959224a8fd1fd4e40b29319ade4" gracePeriod=30 Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.862057 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" 
containerName="ceilometer-notification-agent" containerID="cri-o://0ffeb111de94f467863c0ca42d4c6db5b58c288418ea7226fb0f8b1720a45ef3" gracePeriod=30 Dec 05 00:48:08 crc kubenswrapper[4759]: I1205 00:48:08.890447 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.789746911 podStartE2EDuration="4.890426952s" podCreationTimestamp="2025-12-05 00:48:04 +0000 UTC" firstStartedPulling="2025-12-05 00:48:05.012807308 +0000 UTC m=+1504.228468278" lastFinishedPulling="2025-12-05 00:48:08.113487369 +0000 UTC m=+1507.329148319" observedRunningTime="2025-12-05 00:48:08.886939286 +0000 UTC m=+1508.102600236" watchObservedRunningTime="2025-12-05 00:48:08.890426952 +0000 UTC m=+1508.106087902" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.029791 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.218613 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:48:09 crc kubenswrapper[4759]: W1205 00:48:09.258359 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a80abf5_1bae_43c5_b9f7_9bd20c9fee2d.slice/crio-690c67458496c86cc8cdf4cd3348f5ba69d669b01e3455ebee0cac4e93577b5b WatchSource:0}: Error finding container 690c67458496c86cc8cdf4cd3348f5ba69d669b01e3455ebee0cac4e93577b5b: Status 404 returned error can't find the container with id 690c67458496c86cc8cdf4cd3348f5ba69d669b01e3455ebee0cac4e93577b5b Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.563112 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-t4ld2"] Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.642534 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pntsp"] Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.643795 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.647718 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.648085 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.649904 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pntsp"] Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.666375 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbhf6\" (UniqueName: \"kubernetes.io/projected/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-kube-api-access-bbhf6\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.666415 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-scripts\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.666461 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-config-data\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.666475 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.769588 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbhf6\" (UniqueName: \"kubernetes.io/projected/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-kube-api-access-bbhf6\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.769644 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-scripts\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.769711 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-config-data\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.769728 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.775435 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-config-data\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.775452 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-scripts\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.779329 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.790537 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbhf6\" (UniqueName: \"kubernetes.io/projected/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-kube-api-access-bbhf6\") pod \"nova-cell1-conductor-db-sync-pntsp\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.846394 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-6d43-account-create-update-78sjq"] Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.863428 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:48:09 crc kubenswrapper[4759]: W1205 00:48:09.865424 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa2ac07a_67f0_4c83_a96b_8f745f96d18d.slice/crio-d85ac19aa234f2a52753ee3894af2c05ccee8c31c6f82165e7ebd41e3d2aa212 WatchSource:0}: Error finding container d85ac19aa234f2a52753ee3894af2c05ccee8c31c6f82165e7ebd41e3d2aa212: Status 404 returned error can't find the container with id d85ac19aa234f2a52753ee3894af2c05ccee8c31c6f82165e7ebd41e3d2aa212 Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.875817 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d","Type":"ContainerStarted","Data":"690c67458496c86cc8cdf4cd3348f5ba69d669b01e3455ebee0cac4e93577b5b"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.878659 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.881364 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17962c61-12f2-4ea8-b978-ecf42eebb429","Type":"ContainerStarted","Data":"45f4b8e83818dbc96d6fa6b8e89bbdaaf08ffd8a9411d7f0fc11999323111222"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 
00:48:09.883042 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bsbqv" event={"ID":"15826122-66ee-4470-a806-6328f0fbb58c","Type":"ContainerStarted","Data":"dfc2673f33c52fb8fef51d579367ce11bac54934c259f3e3b8474fcfa6281b66"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.883065 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bsbqv" event={"ID":"15826122-66ee-4470-a806-6328f0fbb58c","Type":"ContainerStarted","Data":"03d217dfa0c3eb9e29db3fec92defe6e5aa55bf854b369bfc42bb2872637c3bc"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.891018 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-b5m5r"] Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.896740 4759 generic.go:334] "Generic (PLEG): container finished" podID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerID="ce0b8baf60a46e17e0207b096ec20b59612a0a8644de870bce52aac881fd69a1" exitCode=0 Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.896765 4759 generic.go:334] "Generic (PLEG): container finished" podID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerID="75b67ec57be4991279d4847e4b903680a5895959224a8fd1fd4e40b29319ade4" exitCode=2 Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.896773 4759 generic.go:334] "Generic (PLEG): container finished" podID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerID="0ffeb111de94f467863c0ca42d4c6db5b58c288418ea7226fb0f8b1720a45ef3" exitCode=0 Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.896838 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095df254-d47f-41fe-bb00-806b9ce2d40d","Type":"ContainerDied","Data":"ce0b8baf60a46e17e0207b096ec20b59612a0a8644de870bce52aac881fd69a1"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.896859 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095df254-d47f-41fe-bb00-806b9ce2d40d","Type":"ContainerDied","Data":"75b67ec57be4991279d4847e4b903680a5895959224a8fd1fd4e40b29319ade4"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.896869 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095df254-d47f-41fe-bb00-806b9ce2d40d","Type":"ContainerDied","Data":"0ffeb111de94f467863c0ca42d4c6db5b58c288418ea7226fb0f8b1720a45ef3"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.903997 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-bsbqv" podStartSLOduration=2.903978903 podStartE2EDuration="2.903978903s" podCreationTimestamp="2025-12-05 00:48:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:09.901938783 +0000 UTC m=+1509.117599733" watchObservedRunningTime="2025-12-05 00:48:09.903978903 +0000 UTC m=+1509.119639853" Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.906213 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" event={"ID":"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b","Type":"ContainerStarted","Data":"3c8a46c221aeab56f69f8614dd3e8fd47ca4219fbe767420cabca4978244a590"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.909843 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-t4ld2" 
event={"ID":"26b2620d-e545-46ea-b97e-3fb934b5053c","Type":"ContainerStarted","Data":"be3dbd8d61f0df32e6d2a66eeb7d6787215f1c0835046283e09095c67f8989e4"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.909886 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-t4ld2" event={"ID":"26b2620d-e545-46ea-b97e-3fb934b5053c","Type":"ContainerStarted","Data":"3c82379d9b21dd671f8f9027e65c3a34fd28445029ab09e2fbb965222adadd73"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.914727 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-6d43-account-create-update-78sjq" event={"ID":"26a46bc1-98c5-4362-93fe-9a2c140cb04d","Type":"ContainerStarted","Data":"76b07b07d39fa0474193765a887781295addd13f9c9ffa97a562af287851af8d"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.916859 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"007ffadd-ee54-4e09-9ff3-e57beb40f0af","Type":"ContainerStarted","Data":"eda421e39e788977dc71f065c502a179d72800d19b8a65a36a77e42479a6ffc8"} Dec 05 00:48:09 crc kubenswrapper[4759]: I1205 00:48:09.998073 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:10 crc kubenswrapper[4759]: I1205 00:48:10.508238 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-t4ld2" podStartSLOduration=3.508218189 podStartE2EDuration="3.508218189s" podCreationTimestamp="2025-12-05 00:48:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:09.923868849 +0000 UTC m=+1509.139529799" watchObservedRunningTime="2025-12-05 00:48:10.508218189 +0000 UTC m=+1509.723879139" Dec 05 00:48:10 crc kubenswrapper[4759]: I1205 00:48:10.525238 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pntsp"] Dec 05 00:48:10 crc kubenswrapper[4759]: W1205 00:48:10.578422 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2f98436_ab15_4fb1_91d5_2b1ae63f25ef.slice/crio-ff9d0cbff861c567395c6a7d95247cad9ab380b20f255f70a0d846523f3a538b WatchSource:0}: Error finding container ff9d0cbff861c567395c6a7d95247cad9ab380b20f255f70a0d846523f3a538b: Status 404 returned error can't find the container with id ff9d0cbff861c567395c6a7d95247cad9ab380b20f255f70a0d846523f3a538b Dec 05 00:48:10 crc kubenswrapper[4759]: I1205 00:48:10.955900 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pntsp" event={"ID":"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef","Type":"ContainerStarted","Data":"ff9d0cbff861c567395c6a7d95247cad9ab380b20f255f70a0d846523f3a538b"} Dec 05 00:48:10 crc kubenswrapper[4759]: I1205 00:48:10.959731 4759 generic.go:334] "Generic (PLEG): container finished" podID="8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" containerID="64f9cd8f6f0d0b1fc5eca5d05767b4e2a93148bf1e0b3dcafa1e6a809dcaf0a8" exitCode=0 Dec 05 00:48:10 crc kubenswrapper[4759]: I1205 00:48:10.959825 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" event={"ID":"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b","Type":"ContainerDied","Data":"64f9cd8f6f0d0b1fc5eca5d05767b4e2a93148bf1e0b3dcafa1e6a809dcaf0a8"} Dec 05 00:48:10 crc kubenswrapper[4759]: I1205 00:48:10.971573 4759 generic.go:334] "Generic (PLEG): container 
finished" podID="26b2620d-e545-46ea-b97e-3fb934b5053c" containerID="be3dbd8d61f0df32e6d2a66eeb7d6787215f1c0835046283e09095c67f8989e4" exitCode=0 Dec 05 00:48:10 crc kubenswrapper[4759]: I1205 00:48:10.971686 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-t4ld2" event={"ID":"26b2620d-e545-46ea-b97e-3fb934b5053c","Type":"ContainerDied","Data":"be3dbd8d61f0df32e6d2a66eeb7d6787215f1c0835046283e09095c67f8989e4"} Dec 05 00:48:10 crc kubenswrapper[4759]: I1205 00:48:10.975280 4759 generic.go:334] "Generic (PLEG): container finished" podID="26a46bc1-98c5-4362-93fe-9a2c140cb04d" containerID="fa80d792c5fc55a9de968aea81dc994418f0f1b8c9ecd2b4c368817386f08df0" exitCode=0 Dec 05 00:48:10 crc kubenswrapper[4759]: I1205 00:48:10.975358 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-6d43-account-create-update-78sjq" event={"ID":"26a46bc1-98c5-4362-93fe-9a2c140cb04d","Type":"ContainerDied","Data":"fa80d792c5fc55a9de968aea81dc994418f0f1b8c9ecd2b4c368817386f08df0"} Dec 05 00:48:10 crc kubenswrapper[4759]: I1205 00:48:10.987092 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aa2ac07a-67f0-4c83-a96b-8f745f96d18d","Type":"ContainerStarted","Data":"d85ac19aa234f2a52753ee3894af2c05ccee8c31c6f82165e7ebd41e3d2aa212"} Dec 05 00:48:11 crc kubenswrapper[4759]: I1205 00:48:11.508775 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:48:11 crc kubenswrapper[4759]: I1205 00:48:11.520263 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.007550 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pntsp" event={"ID":"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef","Type":"ContainerStarted","Data":"3b9bebe3289956d7d5490210b0ff017669d6da94cdb9de41a3498a402b1bc7e0"} Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.030467 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-pntsp" podStartSLOduration=3.03036109 podStartE2EDuration="3.03036109s" podCreationTimestamp="2025-12-05 00:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:12.023013401 +0000 UTC m=+1511.238674341" watchObservedRunningTime="2025-12-05 00:48:12.03036109 +0000 UTC m=+1511.246022040" Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.777773 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-6d43-account-create-update-78sjq" Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.785057 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-t4ld2" Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.849162 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26a46bc1-98c5-4362-93fe-9a2c140cb04d-operator-scripts\") pod \"26a46bc1-98c5-4362-93fe-9a2c140cb04d\" (UID: \"26a46bc1-98c5-4362-93fe-9a2c140cb04d\") " Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.849260 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpdd6\" (UniqueName: \"kubernetes.io/projected/26a46bc1-98c5-4362-93fe-9a2c140cb04d-kube-api-access-qpdd6\") pod \"26a46bc1-98c5-4362-93fe-9a2c140cb04d\" (UID: \"26a46bc1-98c5-4362-93fe-9a2c140cb04d\") " Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.849356 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6gkq\" (UniqueName: \"kubernetes.io/projected/26b2620d-e545-46ea-b97e-3fb934b5053c-kube-api-access-c6gkq\") pod \"26b2620d-e545-46ea-b97e-3fb934b5053c\" (UID: \"26b2620d-e545-46ea-b97e-3fb934b5053c\") " Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.849381 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26b2620d-e545-46ea-b97e-3fb934b5053c-operator-scripts\") pod \"26b2620d-e545-46ea-b97e-3fb934b5053c\" (UID: \"26b2620d-e545-46ea-b97e-3fb934b5053c\") " Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.851800 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26b2620d-e545-46ea-b97e-3fb934b5053c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26b2620d-e545-46ea-b97e-3fb934b5053c" (UID: "26b2620d-e545-46ea-b97e-3fb934b5053c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.851801 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26a46bc1-98c5-4362-93fe-9a2c140cb04d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26a46bc1-98c5-4362-93fe-9a2c140cb04d" (UID: "26a46bc1-98c5-4362-93fe-9a2c140cb04d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.859492 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26a46bc1-98c5-4362-93fe-9a2c140cb04d-kube-api-access-qpdd6" (OuterVolumeSpecName: "kube-api-access-qpdd6") pod "26a46bc1-98c5-4362-93fe-9a2c140cb04d" (UID: "26a46bc1-98c5-4362-93fe-9a2c140cb04d"). InnerVolumeSpecName "kube-api-access-qpdd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.859545 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26b2620d-e545-46ea-b97e-3fb934b5053c-kube-api-access-c6gkq" (OuterVolumeSpecName: "kube-api-access-c6gkq") pod "26b2620d-e545-46ea-b97e-3fb934b5053c" (UID: "26b2620d-e545-46ea-b97e-3fb934b5053c"). InnerVolumeSpecName "kube-api-access-c6gkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.951767 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26a46bc1-98c5-4362-93fe-9a2c140cb04d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.951792 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpdd6\" (UniqueName: \"kubernetes.io/projected/26a46bc1-98c5-4362-93fe-9a2c140cb04d-kube-api-access-qpdd6\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.951804 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6gkq\" (UniqueName: \"kubernetes.io/projected/26b2620d-e545-46ea-b97e-3fb934b5053c-kube-api-access-c6gkq\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:12 crc kubenswrapper[4759]: I1205 00:48:12.951812 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26b2620d-e545-46ea-b97e-3fb934b5053c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:13 crc kubenswrapper[4759]: I1205 00:48:13.020644 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-t4ld2" event={"ID":"26b2620d-e545-46ea-b97e-3fb934b5053c","Type":"ContainerDied","Data":"3c82379d9b21dd671f8f9027e65c3a34fd28445029ab09e2fbb965222adadd73"} Dec 05 00:48:13 crc kubenswrapper[4759]: I1205 00:48:13.020680 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c82379d9b21dd671f8f9027e65c3a34fd28445029ab09e2fbb965222adadd73" Dec 05 00:48:13 crc kubenswrapper[4759]: I1205 00:48:13.020676 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-t4ld2" Dec 05 00:48:13 crc kubenswrapper[4759]: I1205 00:48:13.024131 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-6d43-account-create-update-78sjq" Dec 05 00:48:13 crc kubenswrapper[4759]: I1205 00:48:13.024183 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-6d43-account-create-update-78sjq" event={"ID":"26a46bc1-98c5-4362-93fe-9a2c140cb04d","Type":"ContainerDied","Data":"76b07b07d39fa0474193765a887781295addd13f9c9ffa97a562af287851af8d"} Dec 05 00:48:13 crc kubenswrapper[4759]: I1205 00:48:13.024241 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76b07b07d39fa0474193765a887781295addd13f9c9ffa97a562af287851af8d" Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.041135 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"007ffadd-ee54-4e09-9ff3-e57beb40f0af","Type":"ContainerStarted","Data":"c6c20032a68e4777e37abb6e04d9ce16aba62e8b83af9c129f0017656c2878ca"} Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.041574 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"007ffadd-ee54-4e09-9ff3-e57beb40f0af","Type":"ContainerStarted","Data":"42d72ec345784c9efad3ab9637081e0ae1af2b0c362d1e0be19903481debfa7a"} Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.041456 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="007ffadd-ee54-4e09-9ff3-e57beb40f0af" containerName="nova-metadata-log" containerID="cri-o://42d72ec345784c9efad3ab9637081e0ae1af2b0c362d1e0be19903481debfa7a" gracePeriod=30 Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.041730 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="007ffadd-ee54-4e09-9ff3-e57beb40f0af" containerName="nova-metadata-metadata" containerID="cri-o://c6c20032a68e4777e37abb6e04d9ce16aba62e8b83af9c129f0017656c2878ca" gracePeriod=30 Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.047019 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d","Type":"ContainerStarted","Data":"4252a4bce335f6d9b21272cc315a82624a0d9c2028cd3e5995a0a82357e141b9"} Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.050256 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aa2ac07a-67f0-4c83-a96b-8f745f96d18d","Type":"ContainerStarted","Data":"0299f3d4052f344401bbf464bf2774772bba2187121f10b20ca6953fa77ebfc5"} Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.050479 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="aa2ac07a-67f0-4c83-a96b-8f745f96d18d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0299f3d4052f344401bbf464bf2774772bba2187121f10b20ca6953fa77ebfc5" gracePeriod=30 Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.057891 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17962c61-12f2-4ea8-b978-ecf42eebb429","Type":"ContainerStarted","Data":"10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0"} Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.057942 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17962c61-12f2-4ea8-b978-ecf42eebb429","Type":"ContainerStarted","Data":"93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385"} Dec 05 00:48:14 crc kubenswrapper[4759]: 
I1205 00:48:14.064281 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" event={"ID":"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b","Type":"ContainerStarted","Data":"a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878"} Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.065155 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.065948 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.635519028 podStartE2EDuration="6.06592067s" podCreationTimestamp="2025-12-05 00:48:08 +0000 UTC" firstStartedPulling="2025-12-05 00:48:09.850785594 +0000 UTC m=+1509.066446544" lastFinishedPulling="2025-12-05 00:48:13.281187236 +0000 UTC m=+1512.496848186" observedRunningTime="2025-12-05 00:48:14.06059563 +0000 UTC m=+1513.276256590" watchObservedRunningTime="2025-12-05 00:48:14.06592067 +0000 UTC m=+1513.281581630" Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.092253 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.66714078 podStartE2EDuration="6.092237973s" podCreationTimestamp="2025-12-05 00:48:08 +0000 UTC" firstStartedPulling="2025-12-05 00:48:09.87314691 +0000 UTC m=+1509.088807860" lastFinishedPulling="2025-12-05 00:48:13.298244103 +0000 UTC m=+1512.513905053" observedRunningTime="2025-12-05 00:48:14.08968492 +0000 UTC m=+1513.305345870" watchObservedRunningTime="2025-12-05 00:48:14.092237973 +0000 UTC m=+1513.307898923" Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.126760 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.9507326149999997 podStartE2EDuration="7.126739965s" podCreationTimestamp="2025-12-05 00:48:07 +0000 UTC" firstStartedPulling="2025-12-05 00:48:09.111638314 +0000 UTC m=+1508.327299264" lastFinishedPulling="2025-12-05 00:48:13.287645664 +0000 UTC m=+1512.503306614" observedRunningTime="2025-12-05 00:48:14.116353011 +0000 UTC m=+1513.332013961" watchObservedRunningTime="2025-12-05 00:48:14.126739965 +0000 UTC m=+1513.342400915" Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.138789 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.132507324 podStartE2EDuration="7.138765669s" podCreationTimestamp="2025-12-05 00:48:07 +0000 UTC" firstStartedPulling="2025-12-05 00:48:09.272603535 +0000 UTC m=+1508.488264475" lastFinishedPulling="2025-12-05 00:48:13.27886187 +0000 UTC m=+1512.494522820" observedRunningTime="2025-12-05 00:48:14.133112591 +0000 UTC m=+1513.348773541" watchObservedRunningTime="2025-12-05 00:48:14.138765669 +0000 UTC m=+1513.354426619" Dec 05 00:48:14 crc kubenswrapper[4759]: I1205 00:48:14.163710 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" podStartSLOduration=6.163676607 podStartE2EDuration="6.163676607s" podCreationTimestamp="2025-12-05 00:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:14.158509551 +0000 UTC m=+1513.374170501" watchObservedRunningTime="2025-12-05 00:48:14.163676607 +0000 UTC m=+1513.379337547" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.089026 4759 
generic.go:334] "Generic (PLEG): container finished" podID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerID="4c93ec9f664ce26832935cc3a0e70d49806a8ec7d11470495fd1b9bf4c1e15a5" exitCode=0 Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.089076 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095df254-d47f-41fe-bb00-806b9ce2d40d","Type":"ContainerDied","Data":"4c93ec9f664ce26832935cc3a0e70d49806a8ec7d11470495fd1b9bf4c1e15a5"} Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.093444 4759 generic.go:334] "Generic (PLEG): container finished" podID="007ffadd-ee54-4e09-9ff3-e57beb40f0af" containerID="42d72ec345784c9efad3ab9637081e0ae1af2b0c362d1e0be19903481debfa7a" exitCode=143 Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.094824 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"007ffadd-ee54-4e09-9ff3-e57beb40f0af","Type":"ContainerDied","Data":"42d72ec345784c9efad3ab9637081e0ae1af2b0c362d1e0be19903481debfa7a"} Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.156369 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:48:15 crc kubenswrapper[4759]: E1205 00:48:15.156887 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.408857 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.525892 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-log-httpd\") pod \"095df254-d47f-41fe-bb00-806b9ce2d40d\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.525985 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-config-data\") pod \"095df254-d47f-41fe-bb00-806b9ce2d40d\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.526021 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jslst\" (UniqueName: \"kubernetes.io/projected/095df254-d47f-41fe-bb00-806b9ce2d40d-kube-api-access-jslst\") pod \"095df254-d47f-41fe-bb00-806b9ce2d40d\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.526107 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-combined-ca-bundle\") pod \"095df254-d47f-41fe-bb00-806b9ce2d40d\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.526125 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-run-httpd\") pod \"095df254-d47f-41fe-bb00-806b9ce2d40d\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.526158 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-ceilometer-tls-certs\") pod \"095df254-d47f-41fe-bb00-806b9ce2d40d\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.526217 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-sg-core-conf-yaml\") pod \"095df254-d47f-41fe-bb00-806b9ce2d40d\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.526278 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-scripts\") pod \"095df254-d47f-41fe-bb00-806b9ce2d40d\" (UID: \"095df254-d47f-41fe-bb00-806b9ce2d40d\") " Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.526297 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "095df254-d47f-41fe-bb00-806b9ce2d40d" (UID: "095df254-d47f-41fe-bb00-806b9ce2d40d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.526484 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "095df254-d47f-41fe-bb00-806b9ce2d40d" (UID: "095df254-d47f-41fe-bb00-806b9ce2d40d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.527344 4759 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.527362 4759 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095df254-d47f-41fe-bb00-806b9ce2d40d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.534246 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095df254-d47f-41fe-bb00-806b9ce2d40d-kube-api-access-jslst" (OuterVolumeSpecName: "kube-api-access-jslst") pod "095df254-d47f-41fe-bb00-806b9ce2d40d" (UID: "095df254-d47f-41fe-bb00-806b9ce2d40d"). InnerVolumeSpecName "kube-api-access-jslst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.534849 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-scripts" (OuterVolumeSpecName: "scripts") pod "095df254-d47f-41fe-bb00-806b9ce2d40d" (UID: "095df254-d47f-41fe-bb00-806b9ce2d40d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.591515 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "095df254-d47f-41fe-bb00-806b9ce2d40d" (UID: "095df254-d47f-41fe-bb00-806b9ce2d40d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.596017 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "095df254-d47f-41fe-bb00-806b9ce2d40d" (UID: "095df254-d47f-41fe-bb00-806b9ce2d40d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.628689 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jslst\" (UniqueName: \"kubernetes.io/projected/095df254-d47f-41fe-bb00-806b9ce2d40d-kube-api-access-jslst\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.628857 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.628924 4759 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.629059 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.643498 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "095df254-d47f-41fe-bb00-806b9ce2d40d" (UID: "095df254-d47f-41fe-bb00-806b9ce2d40d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.645767 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-config-data" (OuterVolumeSpecName: "config-data") pod "095df254-d47f-41fe-bb00-806b9ce2d40d" (UID: "095df254-d47f-41fe-bb00-806b9ce2d40d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.731003 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:15 crc kubenswrapper[4759]: I1205 00:48:15.731034 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095df254-d47f-41fe-bb00-806b9ce2d40d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.105747 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.105758 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095df254-d47f-41fe-bb00-806b9ce2d40d","Type":"ContainerDied","Data":"8b24625ad183f5152084663378a424cbb16ba1f4c6eaa38668203d39123d5c1b"} Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.106725 4759 scope.go:117] "RemoveContainer" containerID="ce0b8baf60a46e17e0207b096ec20b59612a0a8644de870bce52aac881fd69a1" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.147728 4759 scope.go:117] "RemoveContainer" containerID="75b67ec57be4991279d4847e4b903680a5895959224a8fd1fd4e40b29319ade4" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.149731 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.168390 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.181155 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:16 crc kubenswrapper[4759]: E1205 00:48:16.181831 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a46bc1-98c5-4362-93fe-9a2c140cb04d" containerName="mariadb-account-create-update" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.181920 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a46bc1-98c5-4362-93fe-9a2c140cb04d" containerName="mariadb-account-create-update" Dec 05 00:48:16 crc kubenswrapper[4759]: E1205 00:48:16.181982 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="ceilometer-central-agent" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.182037 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="ceilometer-central-agent" Dec 05 00:48:16 crc kubenswrapper[4759]: E1205 00:48:16.182098 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b2620d-e545-46ea-b97e-3fb934b5053c" containerName="mariadb-database-create" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.182153 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b2620d-e545-46ea-b97e-3fb934b5053c" containerName="mariadb-database-create" Dec 05 00:48:16 crc kubenswrapper[4759]: E1205 00:48:16.182218 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="proxy-httpd" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.182285 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="proxy-httpd" Dec 05 00:48:16 crc kubenswrapper[4759]: E1205 00:48:16.182371 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="ceilometer-notification-agent" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.182435 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="ceilometer-notification-agent" Dec 05 00:48:16 crc kubenswrapper[4759]: E1205 00:48:16.182503 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="sg-core" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.182554 4759 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="sg-core" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.182795 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="ceilometer-central-agent" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.182867 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="26a46bc1-98c5-4362-93fe-9a2c140cb04d" containerName="mariadb-account-create-update" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.182990 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="sg-core" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.183072 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="ceilometer-notification-agent" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.183141 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" containerName="proxy-httpd" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.183198 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b2620d-e545-46ea-b97e-3fb934b5053c" containerName="mariadb-database-create" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.185221 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.188134 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.188434 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.189142 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.221947 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.245730 4759 scope.go:117] "RemoveContainer" containerID="0ffeb111de94f467863c0ca42d4c6db5b58c288418ea7226fb0f8b1720a45ef3" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.251593 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.251629 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vqkw\" (UniqueName: \"kubernetes.io/projected/38d279a1-2614-44da-9c33-19c4ecd7e1d4-kube-api-access-9vqkw\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.251653 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-run-httpd\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.251680 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.251714 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.252101 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-scripts\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.252154 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-config-data\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.252184 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-log-httpd\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.272056 4759 scope.go:117] "RemoveContainer" containerID="4c93ec9f664ce26832935cc3a0e70d49806a8ec7d11470495fd1b9bf4c1e15a5" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.354122 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.354411 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vqkw\" (UniqueName: \"kubernetes.io/projected/38d279a1-2614-44da-9c33-19c4ecd7e1d4-kube-api-access-9vqkw\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.354524 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-run-httpd\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.354681 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.356706 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.356843 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-scripts\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.356987 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-config-data\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.357086 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-log-httpd\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.357721 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-log-httpd\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.356097 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-run-httpd\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.360718 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.363994 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.366637 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-config-data\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.374127 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.374864 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-scripts\") pod 
\"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.383620 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vqkw\" (UniqueName: \"kubernetes.io/projected/38d279a1-2614-44da-9c33-19c4ecd7e1d4-kube-api-access-9vqkw\") pod \"ceilometer-0\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " pod="openstack/ceilometer-0" Dec 05 00:48:16 crc kubenswrapper[4759]: I1205 00:48:16.537055 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:17 crc kubenswrapper[4759]: I1205 00:48:17.040940 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:17 crc kubenswrapper[4759]: I1205 00:48:17.119640 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d279a1-2614-44da-9c33-19c4ecd7e1d4","Type":"ContainerStarted","Data":"21de9b67f5b3613957c9741610ab464c8df1f195c60195c3a81ef6ac7ee9be91"} Dec 05 00:48:17 crc kubenswrapper[4759]: I1205 00:48:17.121378 4759 generic.go:334] "Generic (PLEG): container finished" podID="15826122-66ee-4470-a806-6328f0fbb58c" containerID="dfc2673f33c52fb8fef51d579367ce11bac54934c259f3e3b8474fcfa6281b66" exitCode=0 Dec 05 00:48:17 crc kubenswrapper[4759]: I1205 00:48:17.121391 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bsbqv" event={"ID":"15826122-66ee-4470-a806-6328f0fbb58c","Type":"ContainerDied","Data":"dfc2673f33c52fb8fef51d579367ce11bac54934c259f3e3b8474fcfa6281b66"} Dec 05 00:48:17 crc kubenswrapper[4759]: I1205 00:48:17.173591 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095df254-d47f-41fe-bb00-806b9ce2d40d" path="/var/lib/kubelet/pods/095df254-d47f-41fe-bb00-806b9ce2d40d/volumes" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.132899 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d279a1-2614-44da-9c33-19c4ecd7e1d4","Type":"ContainerStarted","Data":"9bca42c9fbcb33a3a1562e999f2308ae4f0ac371dc7da7e17be6d6af558ed7c0"} Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.134366 4759 generic.go:334] "Generic (PLEG): container finished" podID="d2f98436-ab15-4fb1-91d5-2b1ae63f25ef" containerID="3b9bebe3289956d7d5490210b0ff017669d6da94cdb9de41a3498a402b1bc7e0" exitCode=0 Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.134427 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pntsp" event={"ID":"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef","Type":"ContainerDied","Data":"3b9bebe3289956d7d5490210b0ff017669d6da94cdb9de41a3498a402b1bc7e0"} Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.530423 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-5f58p"] Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.532359 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.539859 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.540237 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.540996 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.541117 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.541169 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-mr6x2" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.541458 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.549746 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-5f58p"] Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.618113 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-scripts\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.618401 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-combined-ca-bundle\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.618449 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jshwn\" (UniqueName: \"kubernetes.io/projected/bf28a30c-49fe-4e6f-8684-eca499f44133-kube-api-access-jshwn\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.618503 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-config-data\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.623589 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.668840 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.668910 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.708946 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.720372 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdcwd\" (UniqueName: \"kubernetes.io/projected/15826122-66ee-4470-a806-6328f0fbb58c-kube-api-access-rdcwd\") pod \"15826122-66ee-4470-a806-6328f0fbb58c\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.720771 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-combined-ca-bundle\") pod \"15826122-66ee-4470-a806-6328f0fbb58c\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.723605 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-config-data\") pod \"15826122-66ee-4470-a806-6328f0fbb58c\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.723757 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-scripts\") pod \"15826122-66ee-4470-a806-6328f0fbb58c\" (UID: \"15826122-66ee-4470-a806-6328f0fbb58c\") " Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.726657 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-scripts\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.726758 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-combined-ca-bundle\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.726918 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jshwn\" (UniqueName: \"kubernetes.io/projected/bf28a30c-49fe-4e6f-8684-eca499f44133-kube-api-access-jshwn\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.727064 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-config-data\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.730398 4759 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-scripts" (OuterVolumeSpecName: "scripts") pod "15826122-66ee-4470-a806-6328f0fbb58c" (UID: "15826122-66ee-4470-a806-6328f0fbb58c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.733022 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-config-data\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.734412 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.734460 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.737135 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-scripts\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.738010 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-combined-ca-bundle\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.743736 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jshwn\" (UniqueName: \"kubernetes.io/projected/bf28a30c-49fe-4e6f-8684-eca499f44133-kube-api-access-jshwn\") pod \"aodh-db-sync-5f58p\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.762513 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15826122-66ee-4470-a806-6328f0fbb58c-kube-api-access-rdcwd" (OuterVolumeSpecName: "kube-api-access-rdcwd") pod "15826122-66ee-4470-a806-6328f0fbb58c" (UID: "15826122-66ee-4470-a806-6328f0fbb58c"). InnerVolumeSpecName "kube-api-access-rdcwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.774209 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-config-data" (OuterVolumeSpecName: "config-data") pod "15826122-66ee-4470-a806-6328f0fbb58c" (UID: "15826122-66ee-4470-a806-6328f0fbb58c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.797847 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15826122-66ee-4470-a806-6328f0fbb58c" (UID: "15826122-66ee-4470-a806-6328f0fbb58c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.807002 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.819528 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.829211 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdcwd\" (UniqueName: \"kubernetes.io/projected/15826122-66ee-4470-a806-6328f0fbb58c-kube-api-access-rdcwd\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.829513 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.829575 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.829637 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15826122-66ee-4470-a806-6328f0fbb58c-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.883830 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-pzvb2"] Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.884161 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" podUID="3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" containerName="dnsmasq-dns" containerID="cri-o://66e366d87ac58c7498e948a4c3245518cd0f291399d2d92d946b5597530cb083" gracePeriod=10 Dec 05 00:48:18 crc kubenswrapper[4759]: I1205 00:48:18.936707 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.159138 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bsbqv" Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.176242 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bsbqv" event={"ID":"15826122-66ee-4470-a806-6328f0fbb58c","Type":"ContainerDied","Data":"03d217dfa0c3eb9e29db3fec92defe6e5aa55bf854b369bfc42bb2872637c3bc"} Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.176284 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03d217dfa0c3eb9e29db3fec92defe6e5aa55bf854b369bfc42bb2872637c3bc" Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.182356 4759 generic.go:334] "Generic (PLEG): container finished" podID="3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" containerID="66e366d87ac58c7498e948a4c3245518cd0f291399d2d92d946b5597530cb083" exitCode=0 Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.182440 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" event={"ID":"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b","Type":"ContainerDied","Data":"66e366d87ac58c7498e948a4c3245518cd0f291399d2d92d946b5597530cb083"} Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.185459 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d279a1-2614-44da-9c33-19c4ecd7e1d4","Type":"ContainerStarted","Data":"045cbf82ddbb75a9caf5c5a8a1ee7cb4b38e18c7610bf0dc8539e4a6be98f56c"} Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.228408 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.234303 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.237836 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.692136 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.692450 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.786825 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-5f58p"] Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.836633 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.853348 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-nb\") pod \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.853599 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-sb\") pod \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.853691 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-swift-storage-0\") pod \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.853750 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-svc\") pod \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.853799 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-config\") pod \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.853848 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p9j7\" (UniqueName: \"kubernetes.io/projected/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-kube-api-access-4p9j7\") pod \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\" (UID: \"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b\") " Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.863189 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-kube-api-access-4p9j7" (OuterVolumeSpecName: "kube-api-access-4p9j7") pod "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" (UID: "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b"). InnerVolumeSpecName "kube-api-access-4p9j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:19 crc kubenswrapper[4759]: I1205 00:48:19.958439 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p9j7\" (UniqueName: \"kubernetes.io/projected/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-kube-api-access-4p9j7\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.064889 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-config" (OuterVolumeSpecName: "config") pod "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" (UID: "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.065124 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" (UID: "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.088426 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.102421 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" (UID: "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.119815 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" (UID: "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.129842 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" (UID: "3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.161438 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.161472 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.161484 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.161492 4759 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.161502 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.211604 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d279a1-2614-44da-9c33-19c4ecd7e1d4","Type":"ContainerStarted","Data":"37efa3277bf835ca8e9c882acc6c888664033213e07c04b2f8e629679366ef8c"} Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.214685 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" event={"ID":"3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b","Type":"ContainerDied","Data":"9157554f119a9ae5048111ea08e515054854a9439f98d11cf915557f82307a50"} Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.214735 4759 scope.go:117] "RemoveContainer" containerID="66e366d87ac58c7498e948a4c3245518cd0f291399d2d92d946b5597530cb083" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.214860 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.217762 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pntsp" event={"ID":"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef","Type":"ContainerDied","Data":"ff9d0cbff861c567395c6a7d95247cad9ab380b20f255f70a0d846523f3a538b"} Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.217794 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff9d0cbff861c567395c6a7d95247cad9ab380b20f255f70a0d846523f3a538b" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.217847 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pntsp" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.219972 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5f58p" event={"ID":"bf28a30c-49fe-4e6f-8684-eca499f44133","Type":"ContainerStarted","Data":"2a8037f354acf8cf2fb2d5f3ddb1ccfdae6b461e70675557474c54ffd65b9c11"} Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.220119 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerName="nova-api-log" containerID="cri-o://93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385" gracePeriod=30 Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.220377 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerName="nova-api-api" containerID="cri-o://10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0" gracePeriod=30 Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.223939 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 00:48:20 crc kubenswrapper[4759]: E1205 00:48:20.224448 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f98436-ab15-4fb1-91d5-2b1ae63f25ef" containerName="nova-cell1-conductor-db-sync" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.224462 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f98436-ab15-4fb1-91d5-2b1ae63f25ef" containerName="nova-cell1-conductor-db-sync" Dec 05 00:48:20 crc kubenswrapper[4759]: E1205 00:48:20.224488 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" containerName="dnsmasq-dns" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.224496 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" containerName="dnsmasq-dns" Dec 05 00:48:20 crc kubenswrapper[4759]: E1205 00:48:20.224512 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15826122-66ee-4470-a806-6328f0fbb58c" containerName="nova-manage" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.224520 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="15826122-66ee-4470-a806-6328f0fbb58c" containerName="nova-manage" Dec 05 00:48:20 crc kubenswrapper[4759]: E1205 00:48:20.224538 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" containerName="init" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.224546 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" containerName="init" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.224771 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" containerName="dnsmasq-dns" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.224800 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2f98436-ab15-4fb1-91d5-2b1ae63f25ef" containerName="nova-cell1-conductor-db-sync" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.224812 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="15826122-66ee-4470-a806-6328f0fbb58c" containerName="nova-manage" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.225542 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.264110 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-combined-ca-bundle\") pod \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.264553 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-config-data\") pod \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.264680 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-scripts\") pod \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.264765 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbhf6\" (UniqueName: \"kubernetes.io/projected/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-kube-api-access-bbhf6\") pod \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\" (UID: \"d2f98436-ab15-4fb1-91d5-2b1ae63f25ef\") " Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.265719 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.270364 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-scripts" (OuterVolumeSpecName: "scripts") pod "d2f98436-ab15-4fb1-91d5-2b1ae63f25ef" (UID: "d2f98436-ab15-4fb1-91d5-2b1ae63f25ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.270876 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-kube-api-access-bbhf6" (OuterVolumeSpecName: "kube-api-access-bbhf6") pod "d2f98436-ab15-4fb1-91d5-2b1ae63f25ef" (UID: "d2f98436-ab15-4fb1-91d5-2b1ae63f25ef"). InnerVolumeSpecName "kube-api-access-bbhf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.280247 4759 scope.go:117] "RemoveContainer" containerID="b28b2c6a679dc5b38657a02d51c8835bf2f32d1902fe3a2efb704bbd5848a814" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.299409 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-pzvb2"] Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.310448 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2f98436-ab15-4fb1-91d5-2b1ae63f25ef" (UID: "d2f98436-ab15-4fb1-91d5-2b1ae63f25ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.310487 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-config-data" (OuterVolumeSpecName: "config-data") pod "d2f98436-ab15-4fb1-91d5-2b1ae63f25ef" (UID: "d2f98436-ab15-4fb1-91d5-2b1ae63f25ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.311229 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-pzvb2"] Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.366503 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aa5c436-87aa-44d6-b16e-076b4cca0bd5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8aa5c436-87aa-44d6-b16e-076b4cca0bd5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.366615 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa5c436-87aa-44d6-b16e-076b4cca0bd5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8aa5c436-87aa-44d6-b16e-076b4cca0bd5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.367036 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbkng\" (UniqueName: \"kubernetes.io/projected/8aa5c436-87aa-44d6-b16e-076b4cca0bd5-kube-api-access-zbkng\") pod \"nova-cell1-conductor-0\" (UID: \"8aa5c436-87aa-44d6-b16e-076b4cca0bd5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.367268 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.367349 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.367391 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbhf6\" (UniqueName: \"kubernetes.io/projected/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-kube-api-access-bbhf6\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.367409 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.469224 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa5c436-87aa-44d6-b16e-076b4cca0bd5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8aa5c436-87aa-44d6-b16e-076b4cca0bd5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.469354 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbkng\" (UniqueName: 
\"kubernetes.io/projected/8aa5c436-87aa-44d6-b16e-076b4cca0bd5-kube-api-access-zbkng\") pod \"nova-cell1-conductor-0\" (UID: \"8aa5c436-87aa-44d6-b16e-076b4cca0bd5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.469400 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aa5c436-87aa-44d6-b16e-076b4cca0bd5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8aa5c436-87aa-44d6-b16e-076b4cca0bd5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.472985 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa5c436-87aa-44d6-b16e-076b4cca0bd5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8aa5c436-87aa-44d6-b16e-076b4cca0bd5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.473289 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aa5c436-87aa-44d6-b16e-076b4cca0bd5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8aa5c436-87aa-44d6-b16e-076b4cca0bd5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.487704 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbkng\" (UniqueName: \"kubernetes.io/projected/8aa5c436-87aa-44d6-b16e-076b4cca0bd5-kube-api-access-zbkng\") pod \"nova-cell1-conductor-0\" (UID: \"8aa5c436-87aa-44d6-b16e-076b4cca0bd5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:20 crc kubenswrapper[4759]: I1205 00:48:20.558464 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:21 crc kubenswrapper[4759]: I1205 00:48:21.048364 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 00:48:21 crc kubenswrapper[4759]: I1205 00:48:21.174058 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" path="/var/lib/kubelet/pods/3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b/volumes" Dec 05 00:48:21 crc kubenswrapper[4759]: I1205 00:48:21.260821 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d279a1-2614-44da-9c33-19c4ecd7e1d4","Type":"ContainerStarted","Data":"eba0d7a4f2df32e5ff9452fcc0e83e1591ebd907fe3b06891b0091c1ffc6960f"} Dec 05 00:48:21 crc kubenswrapper[4759]: I1205 00:48:21.262333 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 00:48:21 crc kubenswrapper[4759]: I1205 00:48:21.273382 4759 generic.go:334] "Generic (PLEG): container finished" podID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerID="93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385" exitCode=143 Dec 05 00:48:21 crc kubenswrapper[4759]: I1205 00:48:21.273456 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17962c61-12f2-4ea8-b978-ecf42eebb429","Type":"ContainerDied","Data":"93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385"} Dec 05 00:48:21 crc kubenswrapper[4759]: I1205 00:48:21.279235 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d" containerName="nova-scheduler-scheduler" containerID="cri-o://4252a4bce335f6d9b21272cc315a82624a0d9c2028cd3e5995a0a82357e141b9" gracePeriod=30 Dec 05 00:48:21 crc kubenswrapper[4759]: I1205 00:48:21.279498 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8aa5c436-87aa-44d6-b16e-076b4cca0bd5","Type":"ContainerStarted","Data":"c7adce0510287f6db51badd44493c5611bb80cf81cd3c6bea1b483cdfb03e46a"} Dec 05 00:48:21 crc kubenswrapper[4759]: I1205 00:48:21.294047 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.951363615 podStartE2EDuration="5.294028684s" podCreationTimestamp="2025-12-05 00:48:16 +0000 UTC" firstStartedPulling="2025-12-05 00:48:17.051994942 +0000 UTC m=+1516.267655902" lastFinishedPulling="2025-12-05 00:48:20.394660021 +0000 UTC m=+1519.610320971" observedRunningTime="2025-12-05 00:48:21.283496807 +0000 UTC m=+1520.499157757" watchObservedRunningTime="2025-12-05 00:48:21.294028684 +0000 UTC m=+1520.509689634" Dec 05 00:48:22 crc kubenswrapper[4759]: I1205 00:48:22.289919 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8aa5c436-87aa-44d6-b16e-076b4cca0bd5","Type":"ContainerStarted","Data":"ae8a6c33d190be06032011ebe64f7944166991778492ad2a2c9c119ebec52224"} Dec 05 00:48:22 crc kubenswrapper[4759]: I1205 00:48:22.290266 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:22 crc kubenswrapper[4759]: I1205 00:48:22.312275 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.31225922 podStartE2EDuration="2.31225922s" podCreationTimestamp="2025-12-05 00:48:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:22.304263905 +0000 UTC m=+1521.519924855" watchObservedRunningTime="2025-12-05 00:48:22.31225922 +0000 UTC m=+1521.527920170" Dec 05 00:48:23 crc kubenswrapper[4759]: E1205 00:48:23.670559 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4252a4bce335f6d9b21272cc315a82624a0d9c2028cd3e5995a0a82357e141b9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 00:48:23 crc kubenswrapper[4759]: E1205 00:48:23.672433 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4252a4bce335f6d9b21272cc315a82624a0d9c2028cd3e5995a0a82357e141b9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 00:48:23 crc kubenswrapper[4759]: E1205 00:48:23.673962 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4252a4bce335f6d9b21272cc315a82624a0d9c2028cd3e5995a0a82357e141b9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 00:48:23 crc kubenswrapper[4759]: E1205 00:48:23.674034 4759 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d" containerName="nova-scheduler-scheduler" Dec 05 00:48:24 crc kubenswrapper[4759]: I1205 00:48:24.697189 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d978555f9-pzvb2" podUID="3e3faf50-9e57-4e57-ba3a-f8b3b2a88b0b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.194:5353: i/o timeout" Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.321514 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5f58p" event={"ID":"bf28a30c-49fe-4e6f-8684-eca499f44133","Type":"ContainerStarted","Data":"3700d31f2a7981c9d7d9609971c4985fab2655b62afbd25b16860029e3190c83"} Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.323759 4759 generic.go:334] "Generic (PLEG): container finished" podID="2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d" containerID="4252a4bce335f6d9b21272cc315a82624a0d9c2028cd3e5995a0a82357e141b9" exitCode=0 Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.323825 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d","Type":"ContainerDied","Data":"4252a4bce335f6d9b21272cc315a82624a0d9c2028cd3e5995a0a82357e141b9"} Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.346811 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-5f58p" podStartSLOduration=2.664973342 podStartE2EDuration="7.346790185s" podCreationTimestamp="2025-12-05 00:48:18 +0000 UTC" firstStartedPulling="2025-12-05 00:48:19.848636087 +0000 UTC m=+1519.064297037" lastFinishedPulling="2025-12-05 00:48:24.53045293 +0000 UTC m=+1523.746113880" observedRunningTime="2025-12-05 00:48:25.334989506 +0000 UTC m=+1524.550650476" 
watchObservedRunningTime="2025-12-05 00:48:25.346790185 +0000 UTC m=+1524.562451135" Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.419810 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.466240 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-config-data\") pod \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.466400 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-combined-ca-bundle\") pod \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.466759 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tsq5\" (UniqueName: \"kubernetes.io/projected/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-kube-api-access-2tsq5\") pod \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\" (UID: \"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d\") " Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.474652 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-kube-api-access-2tsq5" (OuterVolumeSpecName: "kube-api-access-2tsq5") pod "2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d" (UID: "2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d"). InnerVolumeSpecName "kube-api-access-2tsq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.502022 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d" (UID: "2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.515787 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-config-data" (OuterVolumeSpecName: "config-data") pod "2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d" (UID: "2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.572299 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tsq5\" (UniqueName: \"kubernetes.io/projected/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-kube-api-access-2tsq5\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.572737 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:25 crc kubenswrapper[4759]: I1205 00:48:25.572751 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.194945 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.291070 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-config-data\") pod \"17962c61-12f2-4ea8-b978-ecf42eebb429\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.291144 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-combined-ca-bundle\") pod \"17962c61-12f2-4ea8-b978-ecf42eebb429\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.291237 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17962c61-12f2-4ea8-b978-ecf42eebb429-logs\") pod \"17962c61-12f2-4ea8-b978-ecf42eebb429\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.291364 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlfqn\" (UniqueName: \"kubernetes.io/projected/17962c61-12f2-4ea8-b978-ecf42eebb429-kube-api-access-xlfqn\") pod \"17962c61-12f2-4ea8-b978-ecf42eebb429\" (UID: \"17962c61-12f2-4ea8-b978-ecf42eebb429\") " Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.292611 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17962c61-12f2-4ea8-b978-ecf42eebb429-logs" (OuterVolumeSpecName: "logs") pod "17962c61-12f2-4ea8-b978-ecf42eebb429" (UID: "17962c61-12f2-4ea8-b978-ecf42eebb429"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.295719 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17962c61-12f2-4ea8-b978-ecf42eebb429-kube-api-access-xlfqn" (OuterVolumeSpecName: "kube-api-access-xlfqn") pod "17962c61-12f2-4ea8-b978-ecf42eebb429" (UID: "17962c61-12f2-4ea8-b978-ecf42eebb429"). InnerVolumeSpecName "kube-api-access-xlfqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.319160 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17962c61-12f2-4ea8-b978-ecf42eebb429" (UID: "17962c61-12f2-4ea8-b978-ecf42eebb429"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.319789 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-config-data" (OuterVolumeSpecName: "config-data") pod "17962c61-12f2-4ea8-b978-ecf42eebb429" (UID: "17962c61-12f2-4ea8-b978-ecf42eebb429"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.336464 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d","Type":"ContainerDied","Data":"690c67458496c86cc8cdf4cd3348f5ba69d669b01e3455ebee0cac4e93577b5b"} Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.336488 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.336521 4759 scope.go:117] "RemoveContainer" containerID="4252a4bce335f6d9b21272cc315a82624a0d9c2028cd3e5995a0a82357e141b9" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.344884 4759 generic.go:334] "Generic (PLEG): container finished" podID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerID="10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0" exitCode=0 Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.345024 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17962c61-12f2-4ea8-b978-ecf42eebb429","Type":"ContainerDied","Data":"10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0"} Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.345050 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"17962c61-12f2-4ea8-b978-ecf42eebb429","Type":"ContainerDied","Data":"45f4b8e83818dbc96d6fa6b8e89bbdaaf08ffd8a9411d7f0fc11999323111222"} Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.355638 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.379849 4759 scope.go:117] "RemoveContainer" containerID="10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.394192 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.395582 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.400968 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17962c61-12f2-4ea8-b978-ecf42eebb429-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.401007 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17962c61-12f2-4ea8-b978-ecf42eebb429-logs\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.401018 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlfqn\" (UniqueName: \"kubernetes.io/projected/17962c61-12f2-4ea8-b978-ecf42eebb429-kube-api-access-xlfqn\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.401021 4759 scope.go:117] "RemoveContainer" containerID="93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.414566 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.424158 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.434592 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:48:26 crc kubenswrapper[4759]: E1205 00:48:26.435188 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerName="nova-api-log" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.435215 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerName="nova-api-log" Dec 05 00:48:26 crc kubenswrapper[4759]: E1205 00:48:26.435236 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerName="nova-api-api" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.435245 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerName="nova-api-api" Dec 05 00:48:26 crc kubenswrapper[4759]: E1205 00:48:26.435266 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d" containerName="nova-scheduler-scheduler" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.435275 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d" containerName="nova-scheduler-scheduler" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.435593 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d" containerName="nova-scheduler-scheduler" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.435989 4759 
memory_manager.go:354] "RemoveStaleState removing state" podUID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerName="nova-api-log" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.436026 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="17962c61-12f2-4ea8-b978-ecf42eebb429" containerName="nova-api-api" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.437020 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.435603 4759 scope.go:117] "RemoveContainer" containerID="10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0" Dec 05 00:48:26 crc kubenswrapper[4759]: E1205 00:48:26.439870 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0\": container with ID starting with 10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0 not found: ID does not exist" containerID="10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.439918 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0"} err="failed to get container status \"10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0\": rpc error: code = NotFound desc = could not find container \"10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0\": container with ID starting with 10aa2129e238145cb5c9a7d84acfc5775d54572ed308cce134866735fe06fca0 not found: ID does not exist" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.439947 4759 scope.go:117] "RemoveContainer" containerID="93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.440042 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 00:48:26 crc kubenswrapper[4759]: E1205 00:48:26.441621 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385\": container with ID starting with 93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385 not found: ID does not exist" containerID="93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.441654 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385"} err="failed to get container status \"93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385\": rpc error: code = NotFound desc = could not find container \"93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385\": container with ID starting with 93b3b10896b46edcba96525482a85ce07821fdce8443ab4d0161387499126385 not found: ID does not exist" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.446857 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.463359 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.469977 4759 kubelet.go:2421] "SyncLoop ADD" 
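The ContainerStatus/DeleteContainer errors above are a benign race, not a failure: by the time the kubelet's RemoveContainer fires, CRI-O has already removed the container, so the status lookup comes back with gRPC NotFound and the deletor just logs it and moves on. A sketch of the tolerant pattern against the real CRI client interface (k8s.io/cri-api); the wrapper function itself is illustrative:

```go
package main

import (
	"context"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

// statusOrGone treats gRPC NotFound as "already removed", which is exactly
// how the kubelet shrugs off the errors logged above.
func statusOrGone(ctx context.Context, rs runtimeapi.RuntimeServiceClient, id string) (*runtimeapi.ContainerStatus, bool, error) {
	resp, err := rs.ContainerStatus(ctx, &runtimeapi.ContainerStatusRequest{ContainerId: id})
	if status.Code(err) == codes.NotFound {
		return nil, true, nil // gone: nothing left to delete
	}
	if err != nil {
		return nil, false, fmt.Errorf("ContainerStatus %s: %w", id, err)
	}
	return resp.GetStatus(), false, nil
}

func main() {}
```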
source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.471837 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.473904 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.477803 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.502239 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.502359 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-config-data\") pod \"nova-scheduler-0\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.502400 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.502421 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnxj4\" (UniqueName: \"kubernetes.io/projected/aeed76aa-514a-433f-bce2-5d97d35e8534-kube-api-access-xnxj4\") pod \"nova-scheduler-0\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.502443 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-config-data\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.502469 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1dee261-059a-43fb-9716-92e289f4cd8f-logs\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.502515 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sfgm\" (UniqueName: \"kubernetes.io/projected/d1dee261-059a-43fb-9716-92e289f4cd8f-kube-api-access-6sfgm\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.604831 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-config-data\") pod \"nova-scheduler-0\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc 
kubenswrapper[4759]: I1205 00:48:26.604903 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.604928 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnxj4\" (UniqueName: \"kubernetes.io/projected/aeed76aa-514a-433f-bce2-5d97d35e8534-kube-api-access-xnxj4\") pod \"nova-scheduler-0\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.604951 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-config-data\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.604979 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1dee261-059a-43fb-9716-92e289f4cd8f-logs\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.605116 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sfgm\" (UniqueName: \"kubernetes.io/projected/d1dee261-059a-43fb-9716-92e289f4cd8f-kube-api-access-6sfgm\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.605892 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1dee261-059a-43fb-9716-92e289f4cd8f-logs\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.605960 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.609179 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-config-data\") pod \"nova-scheduler-0\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.609745 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.610475 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc 
kubenswrapper[4759]: I1205 00:48:26.617630 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-config-data\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.623946 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sfgm\" (UniqueName: \"kubernetes.io/projected/d1dee261-059a-43fb-9716-92e289f4cd8f-kube-api-access-6sfgm\") pod \"nova-api-0\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " pod="openstack/nova-api-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.626806 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnxj4\" (UniqueName: \"kubernetes.io/projected/aeed76aa-514a-433f-bce2-5d97d35e8534-kube-api-access-xnxj4\") pod \"nova-scheduler-0\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.757197 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 00:48:26 crc kubenswrapper[4759]: I1205 00:48:26.791967 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:48:27 crc kubenswrapper[4759]: I1205 00:48:27.173615 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17962c61-12f2-4ea8-b978-ecf42eebb429" path="/var/lib/kubelet/pods/17962c61-12f2-4ea8-b978-ecf42eebb429/volumes" Dec 05 00:48:27 crc kubenswrapper[4759]: I1205 00:48:27.174605 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d" path="/var/lib/kubelet/pods/2a80abf5-1bae-43c5-b9f7-9bd20c9fee2d/volumes" Dec 05 00:48:27 crc kubenswrapper[4759]: I1205 00:48:27.275841 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:48:27 crc kubenswrapper[4759]: I1205 00:48:27.357600 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aeed76aa-514a-433f-bce2-5d97d35e8534","Type":"ContainerStarted","Data":"99cd2946562b108777f8910239fafe8b80d1d6763ed1a91912e81ec145b34f7a"} Dec 05 00:48:27 crc kubenswrapper[4759]: I1205 00:48:27.359330 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5f58p" event={"ID":"bf28a30c-49fe-4e6f-8684-eca499f44133","Type":"ContainerDied","Data":"3700d31f2a7981c9d7d9609971c4985fab2655b62afbd25b16860029e3190c83"} Dec 05 00:48:27 crc kubenswrapper[4759]: I1205 00:48:27.359281 4759 generic.go:334] "Generic (PLEG): container finished" podID="bf28a30c-49fe-4e6f-8684-eca499f44133" containerID="3700d31f2a7981c9d7d9609971c4985fab2655b62afbd25b16860029e3190c83" exitCode=0 Dec 05 00:48:27 crc kubenswrapper[4759]: I1205 00:48:27.362616 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:27 crc kubenswrapper[4759]: W1205 00:48:27.363795 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1dee261_059a_43fb_9716_92e289f4cd8f.slice/crio-576dafc30a2b764dd1710c77897fd39b730951ce1b00500d59d02c86539e2d6d WatchSource:0}: Error finding container 576dafc30a2b764dd1710c77897fd39b730951ce1b00500d59d02c86539e2d6d: Status 404 returned error can't find the container with id 
Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.381491 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aeed76aa-514a-433f-bce2-5d97d35e8534","Type":"ContainerStarted","Data":"9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113"}
Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.385151 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1dee261-059a-43fb-9716-92e289f4cd8f","Type":"ContainerStarted","Data":"f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72"}
Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.385292 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1dee261-059a-43fb-9716-92e289f4cd8f","Type":"ContainerStarted","Data":"cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f"}
Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.385351 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1dee261-059a-43fb-9716-92e289f4cd8f","Type":"ContainerStarted","Data":"576dafc30a2b764dd1710c77897fd39b730951ce1b00500d59d02c86539e2d6d"}
Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.407053 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.4069815869999998 podStartE2EDuration="2.406981587s" podCreationTimestamp="2025-12-05 00:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:28.401829961 +0000 UTC m=+1527.617490911" watchObservedRunningTime="2025-12-05 00:48:28.406981587 +0000 UTC m=+1527.622642547"
Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.435938 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.435918003 podStartE2EDuration="2.435918003s" podCreationTimestamp="2025-12-05 00:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:28.423732556 +0000 UTC m=+1527.639393506" watchObservedRunningTime="2025-12-05 00:48:28.435918003 +0000 UTC m=+1527.651578963"
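In both startup-duration records above, firstStartedPulling and lastFinishedPulling are "0001-01-01 00:00:00 +0000 UTC" — Go's zero time.Time, meaning no image pull happened (the images were already on the node) — so podStartSLOduration equals podStartE2EDuration. The trailing ...869999998 on the nova-scheduler figure is ordinary float64 rounding of the same value. The zero-value behaviour in two lines:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	var neverPulled time.Time         // zero value: no pull was ever started
	fmt.Println(neverPulled)          // 0001-01-01 00:00:00 +0000 UTC, as in the log
	fmt.Println(neverPulled.IsZero()) // true, so pulling contributes nothing to the SLO figure
}
```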
Need to start a new one" pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.953839 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-config-data\") pod \"bf28a30c-49fe-4e6f-8684-eca499f44133\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.953971 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jshwn\" (UniqueName: \"kubernetes.io/projected/bf28a30c-49fe-4e6f-8684-eca499f44133-kube-api-access-jshwn\") pod \"bf28a30c-49fe-4e6f-8684-eca499f44133\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.954061 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-combined-ca-bundle\") pod \"bf28a30c-49fe-4e6f-8684-eca499f44133\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.954206 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-scripts\") pod \"bf28a30c-49fe-4e6f-8684-eca499f44133\" (UID: \"bf28a30c-49fe-4e6f-8684-eca499f44133\") " Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.960244 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-scripts" (OuterVolumeSpecName: "scripts") pod "bf28a30c-49fe-4e6f-8684-eca499f44133" (UID: "bf28a30c-49fe-4e6f-8684-eca499f44133"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:28 crc kubenswrapper[4759]: I1205 00:48:28.970234 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf28a30c-49fe-4e6f-8684-eca499f44133-kube-api-access-jshwn" (OuterVolumeSpecName: "kube-api-access-jshwn") pod "bf28a30c-49fe-4e6f-8684-eca499f44133" (UID: "bf28a30c-49fe-4e6f-8684-eca499f44133"). InnerVolumeSpecName "kube-api-access-jshwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.006170 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf28a30c-49fe-4e6f-8684-eca499f44133" (UID: "bf28a30c-49fe-4e6f-8684-eca499f44133"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.021604 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-config-data" (OuterVolumeSpecName: "config-data") pod "bf28a30c-49fe-4e6f-8684-eca499f44133" (UID: "bf28a30c-49fe-4e6f-8684-eca499f44133"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.057029 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.057064 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jshwn\" (UniqueName: \"kubernetes.io/projected/bf28a30c-49fe-4e6f-8684-eca499f44133-kube-api-access-jshwn\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.057078 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.057089 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf28a30c-49fe-4e6f-8684-eca499f44133-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.156487 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:48:29 crc kubenswrapper[4759]: E1205 00:48:29.157014 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.401446 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5f58p" event={"ID":"bf28a30c-49fe-4e6f-8684-eca499f44133","Type":"ContainerDied","Data":"2a8037f354acf8cf2fb2d5f3ddb1ccfdae6b461e70675557474c54ffd65b9c11"} Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.401520 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a8037f354acf8cf2fb2d5f3ddb1ccfdae6b461e70675557474c54ffd65b9c11" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.401554 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-5f58p" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.597090 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 05 00:48:29 crc kubenswrapper[4759]: E1205 00:48:29.602093 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf28a30c-49fe-4e6f-8684-eca499f44133" containerName="aodh-db-sync" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.602127 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf28a30c-49fe-4e6f-8684-eca499f44133" containerName="aodh-db-sync" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.602471 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf28a30c-49fe-4e6f-8684-eca499f44133" containerName="aodh-db-sync" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.604841 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.609795 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.609799 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-mr6x2" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.610228 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.626421 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.669241 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-scripts\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.669375 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-combined-ca-bundle\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.669435 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5q8g\" (UniqueName: \"kubernetes.io/projected/de04d4a9-3563-4423-8185-089c35559589-kube-api-access-g5q8g\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.669471 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-config-data\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.771394 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-combined-ca-bundle\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.771742 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5q8g\" (UniqueName: \"kubernetes.io/projected/de04d4a9-3563-4423-8185-089c35559589-kube-api-access-g5q8g\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.771785 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-config-data\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.771941 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-scripts\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: 
I1205 00:48:29.776687 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-scripts\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.776964 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-combined-ca-bundle\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.784447 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-config-data\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.793027 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5q8g\" (UniqueName: \"kubernetes.io/projected/de04d4a9-3563-4423-8185-089c35559589-kube-api-access-g5q8g\") pod \"aodh-0\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " pod="openstack/aodh-0" Dec 05 00:48:29 crc kubenswrapper[4759]: I1205 00:48:29.928124 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 00:48:30 crc kubenswrapper[4759]: I1205 00:48:30.414560 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 00:48:30 crc kubenswrapper[4759]: I1205 00:48:30.594039 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 00:48:31 crc kubenswrapper[4759]: I1205 00:48:31.437831 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"de04d4a9-3563-4423-8185-089c35559589","Type":"ContainerStarted","Data":"df8a12e8047b8668c5d9a7627e03718bf1d339e6c26c3b2f657b8988b07bb38f"} Dec 05 00:48:31 crc kubenswrapper[4759]: I1205 00:48:31.440684 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"de04d4a9-3563-4423-8185-089c35559589","Type":"ContainerStarted","Data":"219221399cfc497fcfb435bd965c51571346b558a343555eaed6b52c1be3739a"} Dec 05 00:48:31 crc kubenswrapper[4759]: I1205 00:48:31.758290 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 00:48:32 crc kubenswrapper[4759]: I1205 00:48:32.146527 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:32 crc kubenswrapper[4759]: I1205 00:48:32.147229 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="ceilometer-central-agent" containerID="cri-o://9bca42c9fbcb33a3a1562e999f2308ae4f0ac371dc7da7e17be6d6af558ed7c0" gracePeriod=30 Dec 05 00:48:32 crc kubenswrapper[4759]: I1205 00:48:32.148253 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="proxy-httpd" containerID="cri-o://eba0d7a4f2df32e5ff9452fcc0e83e1591ebd907fe3b06891b0091c1ffc6960f" gracePeriod=30 Dec 05 00:48:32 crc kubenswrapper[4759]: I1205 00:48:32.148420 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
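The ceilometer-0 teardown starting above shows the whole termination pipeline in one place: an API DELETE arrives ("SyncLoop DELETE"), and the kubelet fans it out into one "Killing container with a grace period" per container with gracePeriod=30, i.e. SIGTERM now, SIGKILL after 30s. The differing exit codes that follow are each process's reaction to SIGTERM: 0 for a clean shutdown, 2 for sg-core's error path, and the exitCode=143 seen earlier for nova-api is the shell convention 128+15 (SIGTERM). A client-go sketch of issuing such a delete; building the clientset is elided and assumed:

```go
package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
)

// deleteWithGrace issues the kind of DELETE seen above; the kubelet then
// applies the same 30s as gracePeriod=30 on every container of the pod.
func deleteWithGrace(ctx context.Context, cs kubernetes.Interface) error {
	grace := int64(30)
	return cs.CoreV1().Pods("openstack").Delete(ctx, "ceilometer-0",
		metav1.DeleteOptions{GracePeriodSeconds: &grace})
}

func main() {}
```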
podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="ceilometer-notification-agent" containerID="cri-o://045cbf82ddbb75a9caf5c5a8a1ee7cb4b38e18c7610bf0dc8539e4a6be98f56c" gracePeriod=30 Dec 05 00:48:32 crc kubenswrapper[4759]: I1205 00:48:32.148444 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="sg-core" containerID="cri-o://37efa3277bf835ca8e9c882acc6c888664033213e07c04b2f8e629679366ef8c" gracePeriod=30 Dec 05 00:48:32 crc kubenswrapper[4759]: I1205 00:48:32.158218 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.225:3000/\": EOF" Dec 05 00:48:32 crc kubenswrapper[4759]: I1205 00:48:32.451222 4759 generic.go:334] "Generic (PLEG): container finished" podID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerID="eba0d7a4f2df32e5ff9452fcc0e83e1591ebd907fe3b06891b0091c1ffc6960f" exitCode=0 Dec 05 00:48:32 crc kubenswrapper[4759]: I1205 00:48:32.451684 4759 generic.go:334] "Generic (PLEG): container finished" podID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerID="37efa3277bf835ca8e9c882acc6c888664033213e07c04b2f8e629679366ef8c" exitCode=2 Dec 05 00:48:32 crc kubenswrapper[4759]: I1205 00:48:32.451299 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d279a1-2614-44da-9c33-19c4ecd7e1d4","Type":"ContainerDied","Data":"eba0d7a4f2df32e5ff9452fcc0e83e1591ebd907fe3b06891b0091c1ffc6960f"} Dec 05 00:48:32 crc kubenswrapper[4759]: I1205 00:48:32.451733 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d279a1-2614-44da-9c33-19c4ecd7e1d4","Type":"ContainerDied","Data":"37efa3277bf835ca8e9c882acc6c888664033213e07c04b2f8e629679366ef8c"} Dec 05 00:48:32 crc kubenswrapper[4759]: I1205 00:48:32.919783 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 05 00:48:33 crc kubenswrapper[4759]: I1205 00:48:33.465466 4759 generic.go:334] "Generic (PLEG): container finished" podID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerID="9bca42c9fbcb33a3a1562e999f2308ae4f0ac371dc7da7e17be6d6af558ed7c0" exitCode=0 Dec 05 00:48:33 crc kubenswrapper[4759]: I1205 00:48:33.465504 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d279a1-2614-44da-9c33-19c4ecd7e1d4","Type":"ContainerDied","Data":"9bca42c9fbcb33a3a1562e999f2308ae4f0ac371dc7da7e17be6d6af558ed7c0"} Dec 05 00:48:33 crc kubenswrapper[4759]: I1205 00:48:33.469731 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"de04d4a9-3563-4423-8185-089c35559589","Type":"ContainerStarted","Data":"d399944d0148d45787a61fd37d64736aaa479bd239455599558cad800f47ee8d"} Dec 05 00:48:34 crc kubenswrapper[4759]: I1205 00:48:34.482267 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"de04d4a9-3563-4423-8185-089c35559589","Type":"ContainerStarted","Data":"335609a57ad8d4769cf24f6194a490594ac72807791bc83339c83d385c6fdbac"} Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.509805 4759 generic.go:334] "Generic (PLEG): container finished" podID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerID="045cbf82ddbb75a9caf5c5a8a1ee7cb4b38e18c7610bf0dc8539e4a6be98f56c" exitCode=0 Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.510006 4759 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d279a1-2614-44da-9c33-19c4ecd7e1d4","Type":"ContainerDied","Data":"045cbf82ddbb75a9caf5c5a8a1ee7cb4b38e18c7610bf0dc8539e4a6be98f56c"} Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.510373 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d279a1-2614-44da-9c33-19c4ecd7e1d4","Type":"ContainerDied","Data":"21de9b67f5b3613957c9741610ab464c8df1f195c60195c3a81ef6ac7ee9be91"} Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.510395 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21de9b67f5b3613957c9741610ab464c8df1f195c60195c3a81ef6ac7ee9be91" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.589797 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.680298 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-run-httpd\") pod \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.680503 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-ceilometer-tls-certs\") pod \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.680597 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-log-httpd\") pod \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.680747 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-config-data\") pod \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.680808 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "38d279a1-2614-44da-9c33-19c4ecd7e1d4" (UID: "38d279a1-2614-44da-9c33-19c4ecd7e1d4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.681015 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-scripts\") pod \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.681094 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-combined-ca-bundle\") pod \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.681140 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vqkw\" (UniqueName: \"kubernetes.io/projected/38d279a1-2614-44da-9c33-19c4ecd7e1d4-kube-api-access-9vqkw\") pod \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.681177 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-sg-core-conf-yaml\") pod \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\" (UID: \"38d279a1-2614-44da-9c33-19c4ecd7e1d4\") " Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.681170 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "38d279a1-2614-44da-9c33-19c4ecd7e1d4" (UID: "38d279a1-2614-44da-9c33-19c4ecd7e1d4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.682196 4759 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.682220 4759 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d279a1-2614-44da-9c33-19c4ecd7e1d4-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.684804 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d279a1-2614-44da-9c33-19c4ecd7e1d4-kube-api-access-9vqkw" (OuterVolumeSpecName: "kube-api-access-9vqkw") pod "38d279a1-2614-44da-9c33-19c4ecd7e1d4" (UID: "38d279a1-2614-44da-9c33-19c4ecd7e1d4"). InnerVolumeSpecName "kube-api-access-9vqkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.705878 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-scripts" (OuterVolumeSpecName: "scripts") pod "38d279a1-2614-44da-9c33-19c4ecd7e1d4" (UID: "38d279a1-2614-44da-9c33-19c4ecd7e1d4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.714882 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "38d279a1-2614-44da-9c33-19c4ecd7e1d4" (UID: "38d279a1-2614-44da-9c33-19c4ecd7e1d4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.758108 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.758087 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "38d279a1-2614-44da-9c33-19c4ecd7e1d4" (UID: "38d279a1-2614-44da-9c33-19c4ecd7e1d4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.787534 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.787573 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vqkw\" (UniqueName: \"kubernetes.io/projected/38d279a1-2614-44da-9c33-19c4ecd7e1d4-kube-api-access-9vqkw\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.787617 4759 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.787630 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.792794 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.793162 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.798457 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.842039 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38d279a1-2614-44da-9c33-19c4ecd7e1d4" (UID: "38d279a1-2614-44da-9c33-19c4ecd7e1d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.858336 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-config-data" (OuterVolumeSpecName: "config-data") pod "38d279a1-2614-44da-9c33-19c4ecd7e1d4" (UID: "38d279a1-2614-44da-9c33-19c4ecd7e1d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.889186 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:36 crc kubenswrapper[4759]: I1205 00:48:36.889247 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d279a1-2614-44da-9c33-19c4ecd7e1d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.524222 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"de04d4a9-3563-4423-8185-089c35559589","Type":"ContainerStarted","Data":"f8fe1cc148e8ef06846e04effd3c59424ed1a467d4af272abd79ffe3f2aed485"} Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.524259 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.525481 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-api" containerID="cri-o://df8a12e8047b8668c5d9a7627e03718bf1d339e6c26c3b2f657b8988b07bb38f" gracePeriod=30 Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.525570 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-notifier" containerID="cri-o://335609a57ad8d4769cf24f6194a490594ac72807791bc83339c83d385c6fdbac" gracePeriod=30 Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.525578 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-evaluator" containerID="cri-o://d399944d0148d45787a61fd37d64736aaa479bd239455599558cad800f47ee8d" gracePeriod=30 Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.525707 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-listener" containerID="cri-o://f8fe1cc148e8ef06846e04effd3c59424ed1a467d4af272abd79ffe3f2aed485" gracePeriod=30 Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.563068 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.569383 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.575112803 podStartE2EDuration="8.569359756s" podCreationTimestamp="2025-12-05 00:48:29 +0000 UTC" firstStartedPulling="2025-12-05 00:48:30.423716186 +0000 UTC m=+1529.639377136" lastFinishedPulling="2025-12-05 00:48:36.417963129 +0000 UTC m=+1535.633624089" observedRunningTime="2025-12-05 00:48:37.561133506 +0000 UTC m=+1536.776794456" watchObservedRunningTime="2025-12-05 00:48:37.569359756 +0000 UTC m=+1536.785020706" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.593845 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.634408 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.674541 4759 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:37 crc kubenswrapper[4759]: E1205 00:48:37.675056 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="ceilometer-notification-agent" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.675076 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="ceilometer-notification-agent" Dec 05 00:48:37 crc kubenswrapper[4759]: E1205 00:48:37.675087 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="sg-core" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.675094 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="sg-core" Dec 05 00:48:37 crc kubenswrapper[4759]: E1205 00:48:37.675112 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="ceilometer-central-agent" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.675118 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="ceilometer-central-agent" Dec 05 00:48:37 crc kubenswrapper[4759]: E1205 00:48:37.675150 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="proxy-httpd" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.675155 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="proxy-httpd" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.675356 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="proxy-httpd" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.675370 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="ceilometer-notification-agent" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.675389 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="sg-core" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.675401 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" containerName="ceilometer-central-agent" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.677209 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.680778 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.680940 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.681050 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.686743 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.835701 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.229:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.909460 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.909562 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.909604 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tsl9\" (UniqueName: \"kubernetes.io/projected/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-kube-api-access-4tsl9\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.909675 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-scripts\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.909727 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-log-httpd\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.909817 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.909898 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-config-data\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.909999 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-run-httpd\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:37 crc kubenswrapper[4759]: I1205 00:48:37.952486 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.229:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.011506 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.011586 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.011614 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tsl9\" (UniqueName: \"kubernetes.io/projected/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-kube-api-access-4tsl9\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.011655 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-scripts\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.011686 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-log-httpd\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.011744 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.011798 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-config-data\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.011830 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-run-httpd\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.012295 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-run-httpd\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.012737 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-log-httpd\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.021875 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.022249 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.022680 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.023191 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-config-data\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.031504 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-scripts\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.040995 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tsl9\" (UniqueName: \"kubernetes.io/projected/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-kube-api-access-4tsl9\") pod \"ceilometer-0\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.234057 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.541760 4759 generic.go:334] "Generic (PLEG): container finished" podID="de04d4a9-3563-4423-8185-089c35559589" containerID="d399944d0148d45787a61fd37d64736aaa479bd239455599558cad800f47ee8d" exitCode=0 Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.541791 4759 generic.go:334] "Generic (PLEG): container finished" podID="de04d4a9-3563-4423-8185-089c35559589" containerID="df8a12e8047b8668c5d9a7627e03718bf1d339e6c26c3b2f657b8988b07bb38f" exitCode=0 Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.542635 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"de04d4a9-3563-4423-8185-089c35559589","Type":"ContainerDied","Data":"d399944d0148d45787a61fd37d64736aaa479bd239455599558cad800f47ee8d"} Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.542661 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"de04d4a9-3563-4423-8185-089c35559589","Type":"ContainerDied","Data":"df8a12e8047b8668c5d9a7627e03718bf1d339e6c26c3b2f657b8988b07bb38f"} Dec 05 00:48:38 crc kubenswrapper[4759]: I1205 00:48:38.734006 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:38 crc kubenswrapper[4759]: W1205 00:48:38.736725 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa11fcd9_4d8e_4216_93d4_1e5deb59d90b.slice/crio-bb17f2c4b9b0019d18fb270f3ca2380e518054a0fc4912cae9a3135617aa0934 WatchSource:0}: Error finding container bb17f2c4b9b0019d18fb270f3ca2380e518054a0fc4912cae9a3135617aa0934: Status 404 returned error can't find the container with id bb17f2c4b9b0019d18fb270f3ca2380e518054a0fc4912cae9a3135617aa0934 Dec 05 00:48:39 crc kubenswrapper[4759]: I1205 00:48:39.168527 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d279a1-2614-44da-9c33-19c4ecd7e1d4" path="/var/lib/kubelet/pods/38d279a1-2614-44da-9c33-19c4ecd7e1d4/volumes" Dec 05 00:48:39 crc kubenswrapper[4759]: I1205 00:48:39.560393 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b","Type":"ContainerStarted","Data":"64b833c089946c8ba68c40e7067d85e0046553f567137203f00220bb3212c3a0"} Dec 05 00:48:39 crc kubenswrapper[4759]: I1205 00:48:39.560632 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b","Type":"ContainerStarted","Data":"bb17f2c4b9b0019d18fb270f3ca2380e518054a0fc4912cae9a3135617aa0934"} Dec 05 00:48:39 crc kubenswrapper[4759]: I1205 00:48:39.563441 4759 generic.go:334] "Generic (PLEG): container finished" podID="de04d4a9-3563-4423-8185-089c35559589" containerID="335609a57ad8d4769cf24f6194a490594ac72807791bc83339c83d385c6fdbac" exitCode=0 Dec 05 00:48:39 crc kubenswrapper[4759]: I1205 00:48:39.563465 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"de04d4a9-3563-4423-8185-089c35559589","Type":"ContainerDied","Data":"335609a57ad8d4769cf24f6194a490594ac72807791bc83339c83d385c6fdbac"} Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.512283 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2k6tq"] Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.514735 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.528981 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2k6tq"] Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.584520 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b","Type":"ContainerStarted","Data":"03c792b96254eed5ae38fc99d0d841e19f8167a883414c9aa7d4a4c6935ead49"} Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.664935 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-utilities\") pod \"certified-operators-2k6tq\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.665202 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-catalog-content\") pod \"certified-operators-2k6tq\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.665263 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdxd2\" (UniqueName: \"kubernetes.io/projected/8f0eba45-0a26-4650-9878-4b59152d2fbb-kube-api-access-kdxd2\") pod \"certified-operators-2k6tq\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.767441 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-catalog-content\") pod \"certified-operators-2k6tq\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.767510 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxd2\" (UniqueName: \"kubernetes.io/projected/8f0eba45-0a26-4650-9878-4b59152d2fbb-kube-api-access-kdxd2\") pod \"certified-operators-2k6tq\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.767650 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-utilities\") pod \"certified-operators-2k6tq\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.767942 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-catalog-content\") pod \"certified-operators-2k6tq\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.768202 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-utilities\") pod \"certified-operators-2k6tq\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.788921 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdxd2\" (UniqueName: \"kubernetes.io/projected/8f0eba45-0a26-4650-9878-4b59152d2fbb-kube-api-access-kdxd2\") pod \"certified-operators-2k6tq\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:40 crc kubenswrapper[4759]: I1205 00:48:40.843445 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:41 crc kubenswrapper[4759]: I1205 00:48:41.535143 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2k6tq"] Dec 05 00:48:41 crc kubenswrapper[4759]: I1205 00:48:41.612632 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b","Type":"ContainerStarted","Data":"19435c6fff8a1e86fb2713042249cac4b26254b199fee1724e657e3236110582"} Dec 05 00:48:41 crc kubenswrapper[4759]: I1205 00:48:41.615271 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6tq" event={"ID":"8f0eba45-0a26-4650-9878-4b59152d2fbb","Type":"ContainerStarted","Data":"8175f87823494f1d6b67864ccaa76f460f11c29e1b9b302e37ecb60a040e7ca0"} Dec 05 00:48:42 crc kubenswrapper[4759]: I1205 00:48:42.156317 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:48:42 crc kubenswrapper[4759]: E1205 00:48:42.156813 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:48:42 crc kubenswrapper[4759]: I1205 00:48:42.628207 4759 generic.go:334] "Generic (PLEG): container finished" podID="8f0eba45-0a26-4650-9878-4b59152d2fbb" containerID="baacebeea6206fcdc4914d477d7d24891a84a945053b0932ab5321135a56c8eb" exitCode=0 Dec 05 00:48:42 crc kubenswrapper[4759]: I1205 00:48:42.628333 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6tq" event={"ID":"8f0eba45-0a26-4650-9878-4b59152d2fbb","Type":"ContainerDied","Data":"baacebeea6206fcdc4914d477d7d24891a84a945053b0932ab5321135a56c8eb"} Dec 05 00:48:42 crc kubenswrapper[4759]: I1205 00:48:42.635507 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b","Type":"ContainerStarted","Data":"23748e549b58e449d135cf7f10c664ef30c47d43379cbd0c17af4ef30c0e853b"} Dec 05 00:48:42 crc kubenswrapper[4759]: I1205 00:48:42.636627 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 00:48:42 crc kubenswrapper[4759]: I1205 00:48:42.684433 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.757787349 podStartE2EDuration="5.684412979s" 
podCreationTimestamp="2025-12-05 00:48:37 +0000 UTC" firstStartedPulling="2025-12-05 00:48:38.740410464 +0000 UTC m=+1537.956071404" lastFinishedPulling="2025-12-05 00:48:41.667036074 +0000 UTC m=+1540.882697034" observedRunningTime="2025-12-05 00:48:42.673706687 +0000 UTC m=+1541.889367637" watchObservedRunningTime="2025-12-05 00:48:42.684412979 +0000 UTC m=+1541.900073939" Dec 05 00:48:44 crc kubenswrapper[4759]: I1205 00:48:44.664661 4759 generic.go:334] "Generic (PLEG): container finished" podID="007ffadd-ee54-4e09-9ff3-e57beb40f0af" containerID="c6c20032a68e4777e37abb6e04d9ce16aba62e8b83af9c129f0017656c2878ca" exitCode=137 Dec 05 00:48:44 crc kubenswrapper[4759]: I1205 00:48:44.664761 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"007ffadd-ee54-4e09-9ff3-e57beb40f0af","Type":"ContainerDied","Data":"c6c20032a68e4777e37abb6e04d9ce16aba62e8b83af9c129f0017656c2878ca"} Dec 05 00:48:44 crc kubenswrapper[4759]: I1205 00:48:44.667673 4759 generic.go:334] "Generic (PLEG): container finished" podID="aa2ac07a-67f0-4c83-a96b-8f745f96d18d" containerID="0299f3d4052f344401bbf464bf2774772bba2187121f10b20ca6953fa77ebfc5" exitCode=137 Dec 05 00:48:44 crc kubenswrapper[4759]: I1205 00:48:44.667780 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aa2ac07a-67f0-4c83-a96b-8f745f96d18d","Type":"ContainerDied","Data":"0299f3d4052f344401bbf464bf2774772bba2187121f10b20ca6953fa77ebfc5"} Dec 05 00:48:44 crc kubenswrapper[4759]: I1205 00:48:44.669050 4759 generic.go:334] "Generic (PLEG): container finished" podID="8f0eba45-0a26-4650-9878-4b59152d2fbb" containerID="585c371c48a9271287181ee5419ba68933136c9e4dcb197efde86c9b889c1b38" exitCode=0 Dec 05 00:48:44 crc kubenswrapper[4759]: I1205 00:48:44.669094 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6tq" event={"ID":"8f0eba45-0a26-4650-9878-4b59152d2fbb","Type":"ContainerDied","Data":"585c371c48a9271287181ee5419ba68933136c9e4dcb197efde86c9b889c1b38"} Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.712586 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6tq" event={"ID":"8f0eba45-0a26-4650-9878-4b59152d2fbb","Type":"ContainerStarted","Data":"41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5"} Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.736225 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.736606 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"007ffadd-ee54-4e09-9ff3-e57beb40f0af","Type":"ContainerDied","Data":"eda421e39e788977dc71f065c502a179d72800d19b8a65a36a77e42479a6ffc8"} Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.736634 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eda421e39e788977dc71f065c502a179d72800d19b8a65a36a77e42479a6ffc8" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.737491 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2k6tq" podStartSLOduration=3.292171001 podStartE2EDuration="5.737476656s" podCreationTimestamp="2025-12-05 00:48:40 +0000 UTC" firstStartedPulling="2025-12-05 00:48:42.630124123 +0000 UTC m=+1541.845785073" lastFinishedPulling="2025-12-05 00:48:45.075429748 +0000 UTC m=+1544.291090728" observedRunningTime="2025-12-05 00:48:45.729588644 +0000 UTC m=+1544.945249594" watchObservedRunningTime="2025-12-05 00:48:45.737476656 +0000 UTC m=+1544.953137606" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.737663 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.741732 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aa2ac07a-67f0-4c83-a96b-8f745f96d18d","Type":"ContainerDied","Data":"d85ac19aa234f2a52753ee3894af2c05ccee8c31c6f82165e7ebd41e3d2aa212"} Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.741774 4759 scope.go:117] "RemoveContainer" containerID="0299f3d4052f344401bbf464bf2774772bba2187121f10b20ca6953fa77ebfc5" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.741812 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.800776 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007ffadd-ee54-4e09-9ff3-e57beb40f0af-logs\") pod \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.800951 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-config-data\") pod \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.801007 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm7hf\" (UniqueName: \"kubernetes.io/projected/007ffadd-ee54-4e09-9ff3-e57beb40f0af-kube-api-access-qm7hf\") pod \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.801032 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxsbt\" (UniqueName: \"kubernetes.io/projected/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-kube-api-access-fxsbt\") pod \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.801057 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-combined-ca-bundle\") pod \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\" (UID: \"007ffadd-ee54-4e09-9ff3-e57beb40f0af\") " Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.801133 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-config-data\") pod \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.801265 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-combined-ca-bundle\") pod \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\" (UID: \"aa2ac07a-67f0-4c83-a96b-8f745f96d18d\") " Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.804023 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/007ffadd-ee54-4e09-9ff3-e57beb40f0af-logs" (OuterVolumeSpecName: "logs") pod "007ffadd-ee54-4e09-9ff3-e57beb40f0af" (UID: "007ffadd-ee54-4e09-9ff3-e57beb40f0af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.809532 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007ffadd-ee54-4e09-9ff3-e57beb40f0af-kube-api-access-qm7hf" (OuterVolumeSpecName: "kube-api-access-qm7hf") pod "007ffadd-ee54-4e09-9ff3-e57beb40f0af" (UID: "007ffadd-ee54-4e09-9ff3-e57beb40f0af"). InnerVolumeSpecName "kube-api-access-qm7hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.810147 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-kube-api-access-fxsbt" (OuterVolumeSpecName: "kube-api-access-fxsbt") pod "aa2ac07a-67f0-4c83-a96b-8f745f96d18d" (UID: "aa2ac07a-67f0-4c83-a96b-8f745f96d18d"). InnerVolumeSpecName "kube-api-access-fxsbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.859849 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa2ac07a-67f0-4c83-a96b-8f745f96d18d" (UID: "aa2ac07a-67f0-4c83-a96b-8f745f96d18d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.863677 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-config-data" (OuterVolumeSpecName: "config-data") pod "007ffadd-ee54-4e09-9ff3-e57beb40f0af" (UID: "007ffadd-ee54-4e09-9ff3-e57beb40f0af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.864762 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-config-data" (OuterVolumeSpecName: "config-data") pod "aa2ac07a-67f0-4c83-a96b-8f745f96d18d" (UID: "aa2ac07a-67f0-4c83-a96b-8f745f96d18d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.870497 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "007ffadd-ee54-4e09-9ff3-e57beb40f0af" (UID: "007ffadd-ee54-4e09-9ff3-e57beb40f0af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.905582 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.905639 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007ffadd-ee54-4e09-9ff3-e57beb40f0af-logs\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.905651 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.905679 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm7hf\" (UniqueName: \"kubernetes.io/projected/007ffadd-ee54-4e09-9ff3-e57beb40f0af-kube-api-access-qm7hf\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.905693 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxsbt\" (UniqueName: \"kubernetes.io/projected/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-kube-api-access-fxsbt\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.905703 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007ffadd-ee54-4e09-9ff3-e57beb40f0af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:45 crc kubenswrapper[4759]: I1205 00:48:45.905714 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa2ac07a-67f0-4c83-a96b-8f745f96d18d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.072083 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.084682 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.095942 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 00:48:46 crc kubenswrapper[4759]: E1205 00:48:46.096455 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2ac07a-67f0-4c83-a96b-8f745f96d18d" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.096476 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2ac07a-67f0-4c83-a96b-8f745f96d18d" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 00:48:46 crc kubenswrapper[4759]: E1205 00:48:46.096515 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007ffadd-ee54-4e09-9ff3-e57beb40f0af" containerName="nova-metadata-metadata" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.096524 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="007ffadd-ee54-4e09-9ff3-e57beb40f0af" containerName="nova-metadata-metadata" Dec 05 00:48:46 crc kubenswrapper[4759]: E1205 00:48:46.096556 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007ffadd-ee54-4e09-9ff3-e57beb40f0af" containerName="nova-metadata-log" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.096566 4759 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="007ffadd-ee54-4e09-9ff3-e57beb40f0af" containerName="nova-metadata-log" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.096864 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="007ffadd-ee54-4e09-9ff3-e57beb40f0af" containerName="nova-metadata-log" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.096890 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="007ffadd-ee54-4e09-9ff3-e57beb40f0af" containerName="nova-metadata-metadata" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.096909 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2ac07a-67f0-4c83-a96b-8f745f96d18d" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.097813 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.099237 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.099633 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.100169 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.109428 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.109489 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.109522 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.109671 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.109707 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgq5l\" (UniqueName: \"kubernetes.io/projected/7229698f-fca8-46ac-b297-59fd47d15e13-kube-api-access-tgq5l\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.114021 4759 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.212179 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.212232 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.212255 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.212359 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.212378 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgq5l\" (UniqueName: \"kubernetes.io/projected/7229698f-fca8-46ac-b297-59fd47d15e13-kube-api-access-tgq5l\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.219703 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.219733 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.219809 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.225050 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7229698f-fca8-46ac-b297-59fd47d15e13-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 
00:48:46.233981 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgq5l\" (UniqueName: \"kubernetes.io/projected/7229698f-fca8-46ac-b297-59fd47d15e13-kube-api-access-tgq5l\") pod \"nova-cell1-novncproxy-0\" (UID: \"7229698f-fca8-46ac-b297-59fd47d15e13\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.328588 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.760583 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.784963 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 00:48:46 crc kubenswrapper[4759]: W1205 00:48:46.786915 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7229698f_fca8_46ac_b297_59fd47d15e13.slice/crio-0daa27916227e602f196a9294b4a022c3709c5cf590fc87b430fb3774f0bd19f WatchSource:0}: Error finding container 0daa27916227e602f196a9294b4a022c3709c5cf590fc87b430fb3774f0bd19f: Status 404 returned error can't find the container with id 0daa27916227e602f196a9294b4a022c3709c5cf590fc87b430fb3774f0bd19f Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.799219 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.799816 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.816752 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.816969 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.827466 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.857685 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.872484 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.875036 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.877101 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.893105 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.912343 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.941516 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-config-data\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.941645 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.941789 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d59eae83-132a-49a1-bcc8-997553b13d82-logs\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.941811 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:46 crc kubenswrapper[4759]: I1205 00:48:46.942115 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9szvl\" (UniqueName: \"kubernetes.io/projected/d59eae83-132a-49a1-bcc8-997553b13d82-kube-api-access-9szvl\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.043901 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-config-data\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.044033 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.044211 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " 
pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.044233 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d59eae83-132a-49a1-bcc8-997553b13d82-logs\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.044355 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9szvl\" (UniqueName: \"kubernetes.io/projected/d59eae83-132a-49a1-bcc8-997553b13d82-kube-api-access-9szvl\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.045058 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d59eae83-132a-49a1-bcc8-997553b13d82-logs\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.048780 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.049764 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.051223 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-config-data\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.064797 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9szvl\" (UniqueName: \"kubernetes.io/projected/d59eae83-132a-49a1-bcc8-997553b13d82-kube-api-access-9szvl\") pod \"nova-metadata-0\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.174913 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007ffadd-ee54-4e09-9ff3-e57beb40f0af" path="/var/lib/kubelet/pods/007ffadd-ee54-4e09-9ff3-e57beb40f0af/volumes" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.176395 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa2ac07a-67f0-4c83-a96b-8f745f96d18d" path="/var/lib/kubelet/pods/aa2ac07a-67f0-4c83-a96b-8f745f96d18d/volumes" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.216073 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.777959 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7229698f-fca8-46ac-b297-59fd47d15e13","Type":"ContainerStarted","Data":"8dcb0dc43dce4d197090cb1d0bcf9c47b4bfb14a148c1b568d2d425979170f90"} Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.778299 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7229698f-fca8-46ac-b297-59fd47d15e13","Type":"ContainerStarted","Data":"0daa27916227e602f196a9294b4a022c3709c5cf590fc87b430fb3774f0bd19f"} Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.778340 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.787623 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.836679 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:48:47 crc kubenswrapper[4759]: I1205 00:48:47.843748 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.8437243319999999 podStartE2EDuration="1.843724332s" podCreationTimestamp="2025-12-05 00:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:47.818568448 +0000 UTC m=+1547.034229418" watchObservedRunningTime="2025-12-05 00:48:47.843724332 +0000 UTC m=+1547.059385292" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.017031 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-x7rsx"] Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.019111 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.060454 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-x7rsx"] Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.177953 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-svc\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.178005 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-config\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.178024 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-nb\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.178045 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-sb\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.178097 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-swift-storage-0\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.178176 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t82x8\" (UniqueName: \"kubernetes.io/projected/c361255a-5c87-4fcb-81a0-e5160580cc33-kube-api-access-t82x8\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.292506 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t82x8\" (UniqueName: \"kubernetes.io/projected/c361255a-5c87-4fcb-81a0-e5160580cc33-kube-api-access-t82x8\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.292612 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-svc\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.292642 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-config\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.292661 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-nb\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.292679 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-sb\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.292723 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-swift-storage-0\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.293625 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-swift-storage-0\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.294938 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-config\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.294961 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-nb\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.295616 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-sb\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.295804 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-svc\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.315836 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t82x8\" (UniqueName: 
\"kubernetes.io/projected/c361255a-5c87-4fcb-81a0-e5160580cc33-kube-api-access-t82x8\") pod \"dnsmasq-dns-6d99f6bc7f-x7rsx\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.389075 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.802984 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d59eae83-132a-49a1-bcc8-997553b13d82","Type":"ContainerStarted","Data":"77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d"} Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.803269 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d59eae83-132a-49a1-bcc8-997553b13d82","Type":"ContainerStarted","Data":"345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef"} Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.803281 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d59eae83-132a-49a1-bcc8-997553b13d82","Type":"ContainerStarted","Data":"1d14db1bd89ccbd976ebfe41aa641faaabbfa74982d9b69f8ae217872e15f2b9"} Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.832852 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.832833206 podStartE2EDuration="2.832833206s" podCreationTimestamp="2025-12-05 00:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:48.830159551 +0000 UTC m=+1548.045820501" watchObservedRunningTime="2025-12-05 00:48:48.832833206 +0000 UTC m=+1548.048494156" Dec 05 00:48:48 crc kubenswrapper[4759]: I1205 00:48:48.952163 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-x7rsx"] Dec 05 00:48:49 crc kubenswrapper[4759]: I1205 00:48:49.811885 4759 generic.go:334] "Generic (PLEG): container finished" podID="c361255a-5c87-4fcb-81a0-e5160580cc33" containerID="a33482ebef3cd74043accf1d72bfd7580372558ceb256ecb52c51bdce9e7a016" exitCode=0 Dec 05 00:48:49 crc kubenswrapper[4759]: I1205 00:48:49.811990 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" event={"ID":"c361255a-5c87-4fcb-81a0-e5160580cc33","Type":"ContainerDied","Data":"a33482ebef3cd74043accf1d72bfd7580372558ceb256ecb52c51bdce9e7a016"} Dec 05 00:48:49 crc kubenswrapper[4759]: I1205 00:48:49.812337 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" event={"ID":"c361255a-5c87-4fcb-81a0-e5160580cc33","Type":"ContainerStarted","Data":"2d359aead8b6d187e612d5ff30b7f687bd4a1f34e82cc85da4c4bc7a467ca8d6"} Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.234711 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.235418 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="ceilometer-central-agent" containerID="cri-o://64b833c089946c8ba68c40e7067d85e0046553f567137203f00220bb3212c3a0" gracePeriod=30 Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.235895 4759 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="proxy-httpd" containerID="cri-o://23748e549b58e449d135cf7f10c664ef30c47d43379cbd0c17af4ef30c0e853b" gracePeriod=30 Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.236008 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="sg-core" containerID="cri-o://19435c6fff8a1e86fb2713042249cac4b26254b199fee1724e657e3236110582" gracePeriod=30 Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.236115 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="ceilometer-notification-agent" containerID="cri-o://03c792b96254eed5ae38fc99d0d841e19f8167a883414c9aa7d4a4c6935ead49" gracePeriod=30 Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.586335 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.824124 4759 generic.go:334] "Generic (PLEG): container finished" podID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerID="23748e549b58e449d135cf7f10c664ef30c47d43379cbd0c17af4ef30c0e853b" exitCode=0 Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.824446 4759 generic.go:334] "Generic (PLEG): container finished" podID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerID="19435c6fff8a1e86fb2713042249cac4b26254b199fee1724e657e3236110582" exitCode=2 Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.824457 4759 generic.go:334] "Generic (PLEG): container finished" podID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerID="64b833c089946c8ba68c40e7067d85e0046553f567137203f00220bb3212c3a0" exitCode=0 Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.824207 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b","Type":"ContainerDied","Data":"23748e549b58e449d135cf7f10c664ef30c47d43379cbd0c17af4ef30c0e853b"} Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.824510 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b","Type":"ContainerDied","Data":"19435c6fff8a1e86fb2713042249cac4b26254b199fee1724e657e3236110582"} Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.824521 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b","Type":"ContainerDied","Data":"64b833c089946c8ba68c40e7067d85e0046553f567137203f00220bb3212c3a0"} Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.826958 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" event={"ID":"c361255a-5c87-4fcb-81a0-e5160580cc33","Type":"ContainerStarted","Data":"3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b"} Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.827040 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerName="nova-api-log" containerID="cri-o://cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f" gracePeriod=30 Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.827159 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerName="nova-api-api" containerID="cri-o://f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72" gracePeriod=30 Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.853373 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.853417 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.926076 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:50 crc kubenswrapper[4759]: I1205 00:48:50.970554 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" podStartSLOduration=3.97053617 podStartE2EDuration="3.97053617s" podCreationTimestamp="2025-12-05 00:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:50.857670534 +0000 UTC m=+1550.073331484" watchObservedRunningTime="2025-12-05 00:48:50.97053617 +0000 UTC m=+1550.186197120" Dec 05 00:48:51 crc kubenswrapper[4759]: I1205 00:48:51.328752 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:51 crc kubenswrapper[4759]: I1205 00:48:51.846357 4759 generic.go:334] "Generic (PLEG): container finished" podID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerID="03c792b96254eed5ae38fc99d0d841e19f8167a883414c9aa7d4a4c6935ead49" exitCode=0 Dec 05 00:48:51 crc kubenswrapper[4759]: I1205 00:48:51.846560 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b","Type":"ContainerDied","Data":"03c792b96254eed5ae38fc99d0d841e19f8167a883414c9aa7d4a4c6935ead49"} Dec 05 00:48:51 crc kubenswrapper[4759]: I1205 00:48:51.848836 4759 generic.go:334] "Generic (PLEG): container finished" podID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerID="cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f" exitCode=143 Dec 05 00:48:51 crc kubenswrapper[4759]: I1205 00:48:51.849487 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1dee261-059a-43fb-9716-92e289f4cd8f","Type":"ContainerDied","Data":"cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f"} Dec 05 00:48:51 crc kubenswrapper[4759]: I1205 00:48:51.849521 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:51 crc kubenswrapper[4759]: I1205 00:48:51.900973 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:51 crc kubenswrapper[4759]: I1205 00:48:51.979924 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2k6tq"] Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.035629 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.055205 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-config-data\") pod \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.056495 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-run-httpd\") pod \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.056556 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-log-httpd\") pod \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.056992 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-combined-ca-bundle\") pod \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.057328 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-scripts\") pod \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.057417 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-sg-core-conf-yaml\") pod \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.057460 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-ceilometer-tls-certs\") pod \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.057496 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tsl9\" (UniqueName: \"kubernetes.io/projected/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-kube-api-access-4tsl9\") pod \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\" (UID: \"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b\") " Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.058683 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" (UID: "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.058795 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" (UID: "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.079084 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-scripts" (OuterVolumeSpecName: "scripts") pod "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" (UID: "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.087503 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-kube-api-access-4tsl9" (OuterVolumeSpecName: "kube-api-access-4tsl9") pod "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" (UID: "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b"). InnerVolumeSpecName "kube-api-access-4tsl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.111577 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" (UID: "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.161039 4759 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.161074 4759 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.161086 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.161100 4759 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.161114 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tsl9\" (UniqueName: \"kubernetes.io/projected/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-kube-api-access-4tsl9\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.180633 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" (UID: "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.181433 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" (UID: "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.217013 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.217405 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.221505 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-config-data" (OuterVolumeSpecName: "config-data") pod "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" (UID: "fa11fcd9-4d8e-4216-93d4-1e5deb59d90b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.266698 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.266784 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.266800 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.861225 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa11fcd9-4d8e-4216-93d4-1e5deb59d90b","Type":"ContainerDied","Data":"bb17f2c4b9b0019d18fb270f3ca2380e518054a0fc4912cae9a3135617aa0934"} Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.861272 4759 scope.go:117] "RemoveContainer" containerID="23748e549b58e449d135cf7f10c664ef30c47d43379cbd0c17af4ef30c0e853b" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.861333 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.903575 4759 scope.go:117] "RemoveContainer" containerID="19435c6fff8a1e86fb2713042249cac4b26254b199fee1724e657e3236110582" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.906242 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.920217 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.929952 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:52 crc kubenswrapper[4759]: E1205 00:48:52.930413 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="ceilometer-central-agent" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.930431 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="ceilometer-central-agent" Dec 05 00:48:52 crc kubenswrapper[4759]: E1205 00:48:52.930444 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="proxy-httpd" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.930451 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="proxy-httpd" Dec 05 00:48:52 crc kubenswrapper[4759]: E1205 00:48:52.930485 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="ceilometer-notification-agent" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.930491 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="ceilometer-notification-agent" Dec 05 00:48:52 crc kubenswrapper[4759]: E1205 00:48:52.930510 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="sg-core" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.930516 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="sg-core" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.930697 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="ceilometer-notification-agent" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.930717 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="ceilometer-central-agent" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.930731 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="proxy-httpd" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.930746 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" containerName="sg-core" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.933829 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.937289 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.937441 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.937788 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.944646 4759 scope.go:117] "RemoveContainer" containerID="03c792b96254eed5ae38fc99d0d841e19f8167a883414c9aa7d4a4c6935ead49" Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.955937 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:52 crc kubenswrapper[4759]: I1205 00:48:52.982023 4759 scope.go:117] "RemoveContainer" containerID="64b833c089946c8ba68c40e7067d85e0046553f567137203f00220bb3212c3a0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.127400 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.127483 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.127522 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.127625 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqfh6\" (UniqueName: \"kubernetes.io/projected/1a0a1242-dc05-4127-b052-94dec63f8703-kube-api-access-xqfh6\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.127686 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-run-httpd\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.127731 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-scripts\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.127816 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-config-data\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.127851 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-log-httpd\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.175863 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa11fcd9-4d8e-4216-93d4-1e5deb59d90b" path="/var/lib/kubelet/pods/fa11fcd9-4d8e-4216-93d4-1e5deb59d90b/volumes" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.235549 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.235614 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.235640 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.236490 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqfh6\" (UniqueName: \"kubernetes.io/projected/1a0a1242-dc05-4127-b052-94dec63f8703-kube-api-access-xqfh6\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.236582 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-run-httpd\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.236629 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-scripts\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.236675 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-config-data\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.236706 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-log-httpd\") pod \"ceilometer-0\" 
(UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.237674 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-log-httpd\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.237836 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-run-httpd\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.244747 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.245955 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.250349 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.250507 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-scripts\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.256439 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-config-data\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.277852 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqfh6\" (UniqueName: \"kubernetes.io/projected/1a0a1242-dc05-4127-b052-94dec63f8703-kube-api-access-xqfh6\") pod \"ceilometer-0\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.563770 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:48:53 crc kubenswrapper[4759]: W1205 00:48:53.894819 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a0a1242_dc05_4127_b052_94dec63f8703.slice/crio-f423a58ca77aa05de0057ced2256e775f873bfcb7d16ecc975918ff677bd55a5 WatchSource:0}: Error finding container f423a58ca77aa05de0057ced2256e775f873bfcb7d16ecc975918ff677bd55a5: Status 404 returned error can't find the container with id f423a58ca77aa05de0057ced2256e775f873bfcb7d16ecc975918ff677bd55a5 Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.901757 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2k6tq" podUID="8f0eba45-0a26-4650-9878-4b59152d2fbb" containerName="registry-server" containerID="cri-o://41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5" gracePeriod=2 Dec 05 00:48:53 crc kubenswrapper[4759]: I1205 00:48:53.916341 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.508625 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.568917 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.679078 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-config-data\") pod \"d1dee261-059a-43fb-9716-92e289f4cd8f\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.679349 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sfgm\" (UniqueName: \"kubernetes.io/projected/d1dee261-059a-43fb-9716-92e289f4cd8f-kube-api-access-6sfgm\") pod \"d1dee261-059a-43fb-9716-92e289f4cd8f\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.679512 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-utilities\") pod \"8f0eba45-0a26-4650-9878-4b59152d2fbb\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.679824 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdxd2\" (UniqueName: \"kubernetes.io/projected/8f0eba45-0a26-4650-9878-4b59152d2fbb-kube-api-access-kdxd2\") pod \"8f0eba45-0a26-4650-9878-4b59152d2fbb\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.680003 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-combined-ca-bundle\") pod \"d1dee261-059a-43fb-9716-92e289f4cd8f\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.680142 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-catalog-content\") pod 
\"8f0eba45-0a26-4650-9878-4b59152d2fbb\" (UID: \"8f0eba45-0a26-4650-9878-4b59152d2fbb\") " Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.680164 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1dee261-059a-43fb-9716-92e289f4cd8f-logs\") pod \"d1dee261-059a-43fb-9716-92e289f4cd8f\" (UID: \"d1dee261-059a-43fb-9716-92e289f4cd8f\") " Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.681602 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-utilities" (OuterVolumeSpecName: "utilities") pod "8f0eba45-0a26-4650-9878-4b59152d2fbb" (UID: "8f0eba45-0a26-4650-9878-4b59152d2fbb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.682004 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1dee261-059a-43fb-9716-92e289f4cd8f-logs" (OuterVolumeSpecName: "logs") pod "d1dee261-059a-43fb-9716-92e289f4cd8f" (UID: "d1dee261-059a-43fb-9716-92e289f4cd8f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.685846 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1dee261-059a-43fb-9716-92e289f4cd8f-kube-api-access-6sfgm" (OuterVolumeSpecName: "kube-api-access-6sfgm") pod "d1dee261-059a-43fb-9716-92e289f4cd8f" (UID: "d1dee261-059a-43fb-9716-92e289f4cd8f"). InnerVolumeSpecName "kube-api-access-6sfgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.699537 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0eba45-0a26-4650-9878-4b59152d2fbb-kube-api-access-kdxd2" (OuterVolumeSpecName: "kube-api-access-kdxd2") pod "8f0eba45-0a26-4650-9878-4b59152d2fbb" (UID: "8f0eba45-0a26-4650-9878-4b59152d2fbb"). InnerVolumeSpecName "kube-api-access-kdxd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.718582 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1dee261-059a-43fb-9716-92e289f4cd8f" (UID: "d1dee261-059a-43fb-9716-92e289f4cd8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.728470 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-config-data" (OuterVolumeSpecName: "config-data") pod "d1dee261-059a-43fb-9716-92e289f4cd8f" (UID: "d1dee261-059a-43fb-9716-92e289f4cd8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.742653 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f0eba45-0a26-4650-9878-4b59152d2fbb" (UID: "8f0eba45-0a26-4650-9878-4b59152d2fbb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.792395 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.792441 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1dee261-059a-43fb-9716-92e289f4cd8f-logs\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.792451 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.792461 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sfgm\" (UniqueName: \"kubernetes.io/projected/d1dee261-059a-43fb-9716-92e289f4cd8f-kube-api-access-6sfgm\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.792472 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0eba45-0a26-4650-9878-4b59152d2fbb-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.792481 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdxd2\" (UniqueName: \"kubernetes.io/projected/8f0eba45-0a26-4650-9878-4b59152d2fbb-kube-api-access-kdxd2\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.792489 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1dee261-059a-43fb-9716-92e289f4cd8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.928552 4759 generic.go:334] "Generic (PLEG): container finished" podID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerID="f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72" exitCode=0 Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.928618 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1dee261-059a-43fb-9716-92e289f4cd8f","Type":"ContainerDied","Data":"f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72"} Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.928647 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1dee261-059a-43fb-9716-92e289f4cd8f","Type":"ContainerDied","Data":"576dafc30a2b764dd1710c77897fd39b730951ce1b00500d59d02c86539e2d6d"} Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.928660 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.928684 4759 scope.go:117] "RemoveContainer" containerID="f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.948986 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a0a1242-dc05-4127-b052-94dec63f8703","Type":"ContainerStarted","Data":"5d65692d8b3e6b75c1ba3daee41e2d256733b97789e9aa9bdded5fca431b4f97"} Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.949030 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a0a1242-dc05-4127-b052-94dec63f8703","Type":"ContainerStarted","Data":"f423a58ca77aa05de0057ced2256e775f873bfcb7d16ecc975918ff677bd55a5"} Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.954626 4759 generic.go:334] "Generic (PLEG): container finished" podID="8f0eba45-0a26-4650-9878-4b59152d2fbb" containerID="41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5" exitCode=0 Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.954856 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2k6tq" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.955904 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6tq" event={"ID":"8f0eba45-0a26-4650-9878-4b59152d2fbb","Type":"ContainerDied","Data":"41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5"} Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.955958 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2k6tq" event={"ID":"8f0eba45-0a26-4650-9878-4b59152d2fbb","Type":"ContainerDied","Data":"8175f87823494f1d6b67864ccaa76f460f11c29e1b9b302e37ecb60a040e7ca0"} Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.968948 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.977950 4759 scope.go:117] "RemoveContainer" containerID="cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f" Dec 05 00:48:54 crc kubenswrapper[4759]: I1205 00:48:54.989096 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.009271 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:55 crc kubenswrapper[4759]: E1205 00:48:55.010010 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0eba45-0a26-4650-9878-4b59152d2fbb" containerName="extract-utilities" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.010084 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0eba45-0a26-4650-9878-4b59152d2fbb" containerName="extract-utilities" Dec 05 00:48:55 crc kubenswrapper[4759]: E1205 00:48:55.010166 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerName="nova-api-api" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.010230 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerName="nova-api-api" Dec 05 00:48:55 crc kubenswrapper[4759]: E1205 00:48:55.010321 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerName="nova-api-log" Dec 05 00:48:55 crc 
kubenswrapper[4759]: I1205 00:48:55.010390 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerName="nova-api-log"
Dec 05 00:48:55 crc kubenswrapper[4759]: E1205 00:48:55.010466 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0eba45-0a26-4650-9878-4b59152d2fbb" containerName="extract-content"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.010521 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0eba45-0a26-4650-9878-4b59152d2fbb" containerName="extract-content"
Dec 05 00:48:55 crc kubenswrapper[4759]: E1205 00:48:55.010587 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0eba45-0a26-4650-9878-4b59152d2fbb" containerName="registry-server"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.010650 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0eba45-0a26-4650-9878-4b59152d2fbb" containerName="registry-server"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.010909 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerName="nova-api-log"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.010991 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0eba45-0a26-4650-9878-4b59152d2fbb" containerName="registry-server"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.011055 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1dee261-059a-43fb-9716-92e289f4cd8f" containerName="nova-api-api"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.012288 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.015194 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.015430 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.015700 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.023852 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.037639 4759 scope.go:117] "RemoveContainer" containerID="f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72"
Dec 05 00:48:55 crc kubenswrapper[4759]: E1205 00:48:55.038692 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72\": container with ID starting with f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72 not found: ID does not exist" containerID="f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.038719 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72"} err="failed to get container status \"f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72\": rpc error: code = NotFound desc = could not find container \"f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72\": container with ID starting with f78834a78d76cbb15b7579d943a7de21ca4899a3004485bf6a0ed55c21518b72 not found: ID does not exist"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.038740 4759 scope.go:117] "RemoveContainer" containerID="cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f"
Dec 05 00:48:55 crc kubenswrapper[4759]: E1205 00:48:55.039007 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f\": container with ID starting with cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f not found: ID does not exist" containerID="cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.039036 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f"} err="failed to get container status \"cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f\": rpc error: code = NotFound desc = could not find container \"cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f\": container with ID starting with cda721744bfb3a067d03896b0dd5bfaf691ceb620c97a34c32164a693d02b39f not found: ID does not exist"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.039051 4759 scope.go:117] "RemoveContainer" containerID="41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.041447 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2k6tq"]
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.057223 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2k6tq"]
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.067284 4759 scope.go:117] "RemoveContainer" containerID="585c371c48a9271287181ee5419ba68933136c9e4dcb197efde86c9b889c1b38"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.091295 4759 scope.go:117] "RemoveContainer" containerID="baacebeea6206fcdc4914d477d7d24891a84a945053b0932ab5321135a56c8eb"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.103230 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54j2c\" (UniqueName: \"kubernetes.io/projected/59181eac-c082-4e0e-b3d2-24fc49566a4b-kube-api-access-54j2c\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.103349 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-public-tls-certs\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.103467 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.103599 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.103782 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-config-data\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.103923 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59181eac-c082-4e0e-b3d2-24fc49566a4b-logs\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.158946 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1"
Dec 05 00:48:55 crc kubenswrapper[4759]: E1205 00:48:55.159171 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.172090 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0eba45-0a26-4650-9878-4b59152d2fbb" path="/var/lib/kubelet/pods/8f0eba45-0a26-4650-9878-4b59152d2fbb/volumes"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.173124 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1dee261-059a-43fb-9716-92e289f4cd8f" path="/var/lib/kubelet/pods/d1dee261-059a-43fb-9716-92e289f4cd8f/volumes"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.196574 4759 scope.go:117] "RemoveContainer" containerID="41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5"
Dec 05 00:48:55 crc kubenswrapper[4759]: E1205 00:48:55.197047 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5\": container with ID starting with 41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5 not found: ID does not exist" containerID="41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.197083 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5"} err="failed to get container status \"41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5\": rpc error: code = NotFound desc = could not find container \"41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5\": container with ID starting with 41beb5167c33af51b6d415a8703234e89103a387b23f8d49b7246cc1cc2b76a5 not found: ID does not exist"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.197106 4759 scope.go:117] "RemoveContainer" containerID="585c371c48a9271287181ee5419ba68933136c9e4dcb197efde86c9b889c1b38"
Dec 05 00:48:55 crc kubenswrapper[4759]: E1205 00:48:55.197582 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"585c371c48a9271287181ee5419ba68933136c9e4dcb197efde86c9b889c1b38\": container with ID starting with 585c371c48a9271287181ee5419ba68933136c9e4dcb197efde86c9b889c1b38 not found: ID does not exist" containerID="585c371c48a9271287181ee5419ba68933136c9e4dcb197efde86c9b889c1b38"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.197647 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585c371c48a9271287181ee5419ba68933136c9e4dcb197efde86c9b889c1b38"} err="failed to get container status \"585c371c48a9271287181ee5419ba68933136c9e4dcb197efde86c9b889c1b38\": rpc error: code = NotFound desc = could not find container \"585c371c48a9271287181ee5419ba68933136c9e4dcb197efde86c9b889c1b38\": container with ID starting with 585c371c48a9271287181ee5419ba68933136c9e4dcb197efde86c9b889c1b38 not found: ID does not exist"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.197675 4759 scope.go:117] "RemoveContainer" containerID="baacebeea6206fcdc4914d477d7d24891a84a945053b0932ab5321135a56c8eb"
Dec 05 00:48:55 crc kubenswrapper[4759]: E1205 00:48:55.198145 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baacebeea6206fcdc4914d477d7d24891a84a945053b0932ab5321135a56c8eb\": container with ID starting with baacebeea6206fcdc4914d477d7d24891a84a945053b0932ab5321135a56c8eb not found: ID does not exist" containerID="baacebeea6206fcdc4914d477d7d24891a84a945053b0932ab5321135a56c8eb"
Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.198192 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baacebeea6206fcdc4914d477d7d24891a84a945053b0932ab5321135a56c8eb"} err="failed to get container status \"baacebeea6206fcdc4914d477d7d24891a84a945053b0932ab5321135a56c8eb\": rpc error: code = NotFound desc = could not find container \"baacebeea6206fcdc4914d477d7d24891a84a945053b0932ab5321135a56c8eb\": container with ID starting with baacebeea6206fcdc4914d477d7d24891a84a945053b0932ab5321135a56c8eb not found: ID does not exist"
\"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.205268 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-config-data\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.205306 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59181eac-c082-4e0e-b3d2-24fc49566a4b-logs\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.205804 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59181eac-c082-4e0e-b3d2-24fc49566a4b-logs\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.216901 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-public-tls-certs\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.217039 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.217060 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.217518 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-config-data\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.222234 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54j2c\" (UniqueName: \"kubernetes.io/projected/59181eac-c082-4e0e-b3d2-24fc49566a4b-kube-api-access-54j2c\") pod \"nova-api-0\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " pod="openstack/nova-api-0" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.347770 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.906826 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.994259 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a0a1242-dc05-4127-b052-94dec63f8703","Type":"ContainerStarted","Data":"324b888f605a24c9effa764025721279a8154a57d044279ff27d66542fcae63c"} Dec 05 00:48:55 crc kubenswrapper[4759]: I1205 00:48:55.994298 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a0a1242-dc05-4127-b052-94dec63f8703","Type":"ContainerStarted","Data":"6698bd51a4d64dbab26494920e39e1b35b726ac23780c08c197f0392200f84d2"} Dec 05 00:48:56 crc kubenswrapper[4759]: I1205 00:48:56.000192 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59181eac-c082-4e0e-b3d2-24fc49566a4b","Type":"ContainerStarted","Data":"7279e3fffe0742d85b94cf4a1eb82f88d65077102f3b541033dbccf3861f041e"} Dec 05 00:48:56 crc kubenswrapper[4759]: I1205 00:48:56.329630 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:56 crc kubenswrapper[4759]: I1205 00:48:56.361243 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.132140 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59181eac-c082-4e0e-b3d2-24fc49566a4b","Type":"ContainerStarted","Data":"167eeabd6555cc4dedcecf2f4134188ff35d2fb9a8a1b04b7b689ab8bc158bad"} Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.132208 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59181eac-c082-4e0e-b3d2-24fc49566a4b","Type":"ContainerStarted","Data":"6ce6e93417dddb90ad76471c046a7b77ed169ac491e7d3518b383623ef082562"} Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.176740 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.180755 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.180737296 podStartE2EDuration="3.180737296s" podCreationTimestamp="2025-12-05 00:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:57.16776701 +0000 UTC m=+1556.383427960" watchObservedRunningTime="2025-12-05 00:48:57.180737296 +0000 UTC m=+1556.396398246" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.219667 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.219719 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.339007 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-cnpjh"] Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.340557 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.345741 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.345939 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.370915 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-cnpjh"] Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.535657 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.535709 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj77r\" (UniqueName: \"kubernetes.io/projected/d77c176e-03d5-4b5f-908f-a95c826fea16-kube-api-access-dj77r\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.535771 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-config-data\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.535985 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-scripts\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.637521 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-config-data\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.637584 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-scripts\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.637727 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.637761 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj77r\" (UniqueName: 
\"kubernetes.io/projected/d77c176e-03d5-4b5f-908f-a95c826fea16-kube-api-access-dj77r\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.641639 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-config-data\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.642223 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.642471 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-scripts\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.660749 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj77r\" (UniqueName: \"kubernetes.io/projected/d77c176e-03d5-4b5f-908f-a95c826fea16-kube-api-access-dj77r\") pod \"nova-cell1-cell-mapping-cnpjh\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:57 crc kubenswrapper[4759]: I1205 00:48:57.683765 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:48:58 crc kubenswrapper[4759]: I1205 00:48:58.140599 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a0a1242-dc05-4127-b052-94dec63f8703","Type":"ContainerStarted","Data":"143f6072234304eba121d335063837cc15dea13c3c22abd9cfda90d478fa9e19"} Dec 05 00:48:58 crc kubenswrapper[4759]: I1205 00:48:58.164739 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8296112 podStartE2EDuration="6.164722796s" podCreationTimestamp="2025-12-05 00:48:52 +0000 UTC" firstStartedPulling="2025-12-05 00:48:53.898672516 +0000 UTC m=+1553.114333476" lastFinishedPulling="2025-12-05 00:48:57.233784122 +0000 UTC m=+1556.449445072" observedRunningTime="2025-12-05 00:48:58.161195219 +0000 UTC m=+1557.376856169" watchObservedRunningTime="2025-12-05 00:48:58.164722796 +0000 UTC m=+1557.380383746" Dec 05 00:48:58 crc kubenswrapper[4759]: I1205 00:48:58.205949 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-cnpjh"] Dec 05 00:48:58 crc kubenswrapper[4759]: I1205 00:48:58.263489 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.234:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 00:48:58 crc kubenswrapper[4759]: I1205 00:48:58.263594 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.234:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 00:48:58 crc kubenswrapper[4759]: I1205 00:48:58.440645 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:48:58 crc kubenswrapper[4759]: I1205 00:48:58.515461 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-b5m5r"] Dec 05 00:48:58 crc kubenswrapper[4759]: I1205 00:48:58.515678 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" podUID="8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" containerName="dnsmasq-dns" containerID="cri-o://a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878" gracePeriod=10 Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.131870 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.178110 4759 generic.go:334] "Generic (PLEG): container finished" podID="8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" containerID="a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878" exitCode=0 Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.178792 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.181031 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.181062 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cnpjh" event={"ID":"d77c176e-03d5-4b5f-908f-a95c826fea16","Type":"ContainerStarted","Data":"89604e724134de4f98f0226eb9699c41ddc6d675cd76437609c5690f082425fa"} Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.181081 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cnpjh" event={"ID":"d77c176e-03d5-4b5f-908f-a95c826fea16","Type":"ContainerStarted","Data":"352137b706d3a3620b528d559203b245ba1e5707bb360a5d9abdf0a35b69a5b2"} Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.181094 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" event={"ID":"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b","Type":"ContainerDied","Data":"a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878"} Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.181108 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" event={"ID":"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b","Type":"ContainerDied","Data":"3c8a46c221aeab56f69f8614dd3e8fd47ca4219fbe767420cabca4978244a590"} Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.181128 4759 scope.go:117] "RemoveContainer" containerID="a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.220741 4759 scope.go:117] "RemoveContainer" containerID="64f9cd8f6f0d0b1fc5eca5d05767b4e2a93148bf1e0b3dcafa1e6a809dcaf0a8" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.249911 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-cnpjh" podStartSLOduration=2.249872856 podStartE2EDuration="2.249872856s" podCreationTimestamp="2025-12-05 00:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:48:59.194403571 +0000 UTC m=+1558.410064521" watchObservedRunningTime="2025-12-05 00:48:59.249872856 +0000 UTC m=+1558.465533806" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.276687 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-sb\") pod \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.277851 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-swift-storage-0\") pod \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.278167 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgq5z\" (UniqueName: \"kubernetes.io/projected/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-kube-api-access-kgq5z\") pod \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 
00:48:59.278217 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-nb\") pod \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.278245 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-config\") pod \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.278329 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-svc\") pod \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\" (UID: \"8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b\") " Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.289585 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-kube-api-access-kgq5z" (OuterVolumeSpecName: "kube-api-access-kgq5z") pod "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" (UID: "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b"). InnerVolumeSpecName "kube-api-access-kgq5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.294583 4759 scope.go:117] "RemoveContainer" containerID="a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878" Dec 05 00:48:59 crc kubenswrapper[4759]: E1205 00:48:59.298549 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878\": container with ID starting with a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878 not found: ID does not exist" containerID="a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.298594 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878"} err="failed to get container status \"a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878\": rpc error: code = NotFound desc = could not find container \"a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878\": container with ID starting with a7cd8d62a4bdc8513d10bd32382ee2794ea72594aa10390635b2dfb039dcc878 not found: ID does not exist" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.298619 4759 scope.go:117] "RemoveContainer" containerID="64f9cd8f6f0d0b1fc5eca5d05767b4e2a93148bf1e0b3dcafa1e6a809dcaf0a8" Dec 05 00:48:59 crc kubenswrapper[4759]: E1205 00:48:59.299013 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f9cd8f6f0d0b1fc5eca5d05767b4e2a93148bf1e0b3dcafa1e6a809dcaf0a8\": container with ID starting with 64f9cd8f6f0d0b1fc5eca5d05767b4e2a93148bf1e0b3dcafa1e6a809dcaf0a8 not found: ID does not exist" containerID="64f9cd8f6f0d0b1fc5eca5d05767b4e2a93148bf1e0b3dcafa1e6a809dcaf0a8" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.299040 4759 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"64f9cd8f6f0d0b1fc5eca5d05767b4e2a93148bf1e0b3dcafa1e6a809dcaf0a8"} err="failed to get container status \"64f9cd8f6f0d0b1fc5eca5d05767b4e2a93148bf1e0b3dcafa1e6a809dcaf0a8\": rpc error: code = NotFound desc = could not find container \"64f9cd8f6f0d0b1fc5eca5d05767b4e2a93148bf1e0b3dcafa1e6a809dcaf0a8\": container with ID starting with 64f9cd8f6f0d0b1fc5eca5d05767b4e2a93148bf1e0b3dcafa1e6a809dcaf0a8 not found: ID does not exist" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.345485 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" (UID: "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.346045 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" (UID: "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.352748 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" (UID: "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.355089 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-config" (OuterVolumeSpecName: "config") pod "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" (UID: "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.375239 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" (UID: "8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.382945 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.382978 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.382989 4759 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.383003 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgq5z\" (UniqueName: \"kubernetes.io/projected/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-kube-api-access-kgq5z\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.383013 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.383022 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.516806 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-b5m5r"] Dec 05 00:48:59 crc kubenswrapper[4759]: I1205 00:48:59.525660 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-b5m5r"] Dec 05 00:49:01 crc kubenswrapper[4759]: I1205 00:49:01.173453 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" path="/var/lib/kubelet/pods/8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b/volumes" Dec 05 00:49:03 crc kubenswrapper[4759]: I1205 00:49:03.818639 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7877d89589-b5m5r" podUID="8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.223:5353: i/o timeout" Dec 05 00:49:04 crc kubenswrapper[4759]: I1205 00:49:04.258335 4759 generic.go:334] "Generic (PLEG): container finished" podID="d77c176e-03d5-4b5f-908f-a95c826fea16" containerID="89604e724134de4f98f0226eb9699c41ddc6d675cd76437609c5690f082425fa" exitCode=0 Dec 05 00:49:04 crc kubenswrapper[4759]: I1205 00:49:04.258399 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cnpjh" event={"ID":"d77c176e-03d5-4b5f-908f-a95c826fea16","Type":"ContainerDied","Data":"89604e724134de4f98f0226eb9699c41ddc6d675cd76437609c5690f082425fa"} Dec 05 00:49:05 crc kubenswrapper[4759]: I1205 00:49:05.348542 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 00:49:05 crc kubenswrapper[4759]: I1205 00:49:05.349112 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.696021 4759 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/community-operators-h49b8"] Dec 05 00:49:06 crc kubenswrapper[4759]: E1205 00:49:05.697474 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" containerName="dnsmasq-dns" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.697492 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" containerName="dnsmasq-dns" Dec 05 00:49:06 crc kubenswrapper[4759]: E1205 00:49:05.697512 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" containerName="init" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.697519 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" containerName="init" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.697901 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee67739-a1ad-4b2a-abc1-2626ef3a6a5b" containerName="dnsmasq-dns" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.700066 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.714726 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h49b8"] Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.853471 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.857889 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkb48\" (UniqueName: \"kubernetes.io/projected/c62aef30-b249-48f4-944a-094a692b4101-kube-api-access-hkb48\") pod \"community-operators-h49b8\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.858031 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-utilities\") pod \"community-operators-h49b8\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.858090 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-catalog-content\") pod \"community-operators-h49b8\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.959474 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-scripts\") pod \"d77c176e-03d5-4b5f-908f-a95c826fea16\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.959582 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj77r\" (UniqueName: \"kubernetes.io/projected/d77c176e-03d5-4b5f-908f-a95c826fea16-kube-api-access-dj77r\") pod \"d77c176e-03d5-4b5f-908f-a95c826fea16\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " Dec 05 
00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.959618 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-combined-ca-bundle\") pod \"d77c176e-03d5-4b5f-908f-a95c826fea16\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.959646 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-config-data\") pod \"d77c176e-03d5-4b5f-908f-a95c826fea16\" (UID: \"d77c176e-03d5-4b5f-908f-a95c826fea16\") " Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.960097 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-utilities\") pod \"community-operators-h49b8\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.960144 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-catalog-content\") pod \"community-operators-h49b8\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.960229 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkb48\" (UniqueName: \"kubernetes.io/projected/c62aef30-b249-48f4-944a-094a692b4101-kube-api-access-hkb48\") pod \"community-operators-h49b8\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.960925 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-utilities\") pod \"community-operators-h49b8\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.961349 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-catalog-content\") pod \"community-operators-h49b8\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.966905 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77c176e-03d5-4b5f-908f-a95c826fea16-kube-api-access-dj77r" (OuterVolumeSpecName: "kube-api-access-dj77r") pod "d77c176e-03d5-4b5f-908f-a95c826fea16" (UID: "d77c176e-03d5-4b5f-908f-a95c826fea16"). InnerVolumeSpecName "kube-api-access-dj77r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:05.984219 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkb48\" (UniqueName: \"kubernetes.io/projected/c62aef30-b249-48f4-944a-094a692b4101-kube-api-access-hkb48\") pod \"community-operators-h49b8\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.009173 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-config-data" (OuterVolumeSpecName: "config-data") pod "d77c176e-03d5-4b5f-908f-a95c826fea16" (UID: "d77c176e-03d5-4b5f-908f-a95c826fea16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.014214 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d77c176e-03d5-4b5f-908f-a95c826fea16" (UID: "d77c176e-03d5-4b5f-908f-a95c826fea16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.015355 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-scripts" (OuterVolumeSpecName: "scripts") pod "d77c176e-03d5-4b5f-908f-a95c826fea16" (UID: "d77c176e-03d5-4b5f-908f-a95c826fea16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.039868 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.062842 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.062873 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.062884 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d77c176e-03d5-4b5f-908f-a95c826fea16-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.062896 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj77r\" (UniqueName: \"kubernetes.io/projected/d77c176e-03d5-4b5f-908f-a95c826fea16-kube-api-access-dj77r\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.289922 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cnpjh" event={"ID":"d77c176e-03d5-4b5f-908f-a95c826fea16","Type":"ContainerDied","Data":"352137b706d3a3620b528d559203b245ba1e5707bb360a5d9abdf0a35b69a5b2"} Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.290281 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="352137b706d3a3620b528d559203b245ba1e5707bb360a5d9abdf0a35b69a5b2" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.290045 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cnpjh" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.363441 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.237:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.363449 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.237:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.467032 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.467340 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-log" containerID="cri-o://6ce6e93417dddb90ad76471c046a7b77ed169ac491e7d3518b383623ef082562" gracePeriod=30 Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.467811 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-api" containerID="cri-o://167eeabd6555cc4dedcecf2f4134188ff35d2fb9a8a1b04b7b689ab8bc158bad" gracePeriod=30 Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.493459 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.493709 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="aeed76aa-514a-433f-bce2-5d97d35e8534" containerName="nova-scheduler-scheduler" containerID="cri-o://9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113" gracePeriod=30 Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.511836 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.512146 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-log" containerID="cri-o://345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef" gracePeriod=30 Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.512273 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-metadata" containerID="cri-o://77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d" gracePeriod=30 Dec 05 00:49:06 crc kubenswrapper[4759]: W1205 00:49:06.594704 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc62aef30_b249_48f4_944a_094a692b4101.slice/crio-ae95341db8a35fa1519112eb2baaf5db925f11070c1ade36754a0ab193838797 WatchSource:0}: Error finding container ae95341db8a35fa1519112eb2baaf5db925f11070c1ade36754a0ab193838797: Status 404 returned error can't find the container with id ae95341db8a35fa1519112eb2baaf5db925f11070c1ade36754a0ab193838797 Dec 05 00:49:06 
Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.467032 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.467340 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-log" containerID="cri-o://6ce6e93417dddb90ad76471c046a7b77ed169ac491e7d3518b383623ef082562" gracePeriod=30
Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.467811 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-api" containerID="cri-o://167eeabd6555cc4dedcecf2f4134188ff35d2fb9a8a1b04b7b689ab8bc158bad" gracePeriod=30
Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.493459 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.493709 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="aeed76aa-514a-433f-bce2-5d97d35e8534" containerName="nova-scheduler-scheduler" containerID="cri-o://9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113" gracePeriod=30
Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.511836 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.512146 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-log" containerID="cri-o://345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef" gracePeriod=30
Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.512273 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-metadata" containerID="cri-o://77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d" gracePeriod=30
Dec 05 00:49:06 crc kubenswrapper[4759]: W1205 00:49:06.594704 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc62aef30_b249_48f4_944a_094a692b4101.slice/crio-ae95341db8a35fa1519112eb2baaf5db925f11070c1ade36754a0ab193838797 WatchSource:0}: Error finding container ae95341db8a35fa1519112eb2baaf5db925f11070c1ade36754a0ab193838797: Status 404 returned error can't find the container with id ae95341db8a35fa1519112eb2baaf5db925f11070c1ade36754a0ab193838797
Dec 05 00:49:06 crc kubenswrapper[4759]: I1205 00:49:06.603465 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h49b8"]
Dec 05 00:49:06 crc kubenswrapper[4759]: E1205 00:49:06.761787 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 05 00:49:06 crc kubenswrapper[4759]: E1205 00:49:06.763760 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 05 00:49:06 crc kubenswrapper[4759]: E1205 00:49:06.765672 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 05 00:49:06 crc kubenswrapper[4759]: E1205 00:49:06.765733 4759 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="aeed76aa-514a-433f-bce2-5d97d35e8534" containerName="nova-scheduler-scheduler"
Dec 05 00:49:07 crc kubenswrapper[4759]: I1205 00:49:07.306454 4759 generic.go:334] "Generic (PLEG): container finished" podID="d59eae83-132a-49a1-bcc8-997553b13d82" containerID="345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef" exitCode=143
Dec 05 00:49:07 crc kubenswrapper[4759]: I1205 00:49:07.306527 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d59eae83-132a-49a1-bcc8-997553b13d82","Type":"ContainerDied","Data":"345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef"}
Dec 05 00:49:07 crc kubenswrapper[4759]: I1205 00:49:07.308646 4759 generic.go:334] "Generic (PLEG): container finished" podID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerID="6ce6e93417dddb90ad76471c046a7b77ed169ac491e7d3518b383623ef082562" exitCode=143
Dec 05 00:49:07 crc kubenswrapper[4759]: I1205 00:49:07.308717 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59181eac-c082-4e0e-b3d2-24fc49566a4b","Type":"ContainerDied","Data":"6ce6e93417dddb90ad76471c046a7b77ed169ac491e7d3518b383623ef082562"}
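The exitCode=143 results just above are the conventional 128+signal encoding: "Killing container with a grace period" delivers SIGTERM (signal 15), and both containers exited on it within their 30s gracePeriod. A container that has to be SIGKILLed instead reports 137, as the aodh-0 container further below does. The arithmetic in Go:

    package main

    import (
        "fmt"
        "syscall"
    )

    func main() {
        fmt.Println(128 + int(syscall.SIGTERM)) // 143: terminated by SIGTERM
        fmt.Println(128 + int(syscall.SIGKILL)) // 137: killed after the grace period
    }
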
pod="openshift-marketplace/community-operators-h49b8" event={"ID":"c62aef30-b249-48f4-944a-094a692b4101","Type":"ContainerStarted","Data":"ae95341db8a35fa1519112eb2baaf5db925f11070c1ade36754a0ab193838797"} Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.332127 4759 generic.go:334] "Generic (PLEG): container finished" podID="de04d4a9-3563-4423-8185-089c35559589" containerID="f8fe1cc148e8ef06846e04effd3c59424ed1a467d4af272abd79ffe3f2aed485" exitCode=137 Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.332335 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"de04d4a9-3563-4423-8185-089c35559589","Type":"ContainerDied","Data":"f8fe1cc148e8ef06846e04effd3c59424ed1a467d4af272abd79ffe3f2aed485"} Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.482917 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.523245 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-scripts\") pod \"de04d4a9-3563-4423-8185-089c35559589\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.523779 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5q8g\" (UniqueName: \"kubernetes.io/projected/de04d4a9-3563-4423-8185-089c35559589-kube-api-access-g5q8g\") pod \"de04d4a9-3563-4423-8185-089c35559589\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.524586 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-combined-ca-bundle\") pod \"de04d4a9-3563-4423-8185-089c35559589\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.525566 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-config-data\") pod \"de04d4a9-3563-4423-8185-089c35559589\" (UID: \"de04d4a9-3563-4423-8185-089c35559589\") " Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.535047 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-scripts" (OuterVolumeSpecName: "scripts") pod "de04d4a9-3563-4423-8185-089c35559589" (UID: "de04d4a9-3563-4423-8185-089c35559589"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.535195 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de04d4a9-3563-4423-8185-089c35559589-kube-api-access-g5q8g" (OuterVolumeSpecName: "kube-api-access-g5q8g") pod "de04d4a9-3563-4423-8185-089c35559589" (UID: "de04d4a9-3563-4423-8185-089c35559589"). InnerVolumeSpecName "kube-api-access-g5q8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.628833 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.629108 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5q8g\" (UniqueName: \"kubernetes.io/projected/de04d4a9-3563-4423-8185-089c35559589-kube-api-access-g5q8g\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.678129 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-config-data" (OuterVolumeSpecName: "config-data") pod "de04d4a9-3563-4423-8185-089c35559589" (UID: "de04d4a9-3563-4423-8185-089c35559589"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.698816 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de04d4a9-3563-4423-8185-089c35559589" (UID: "de04d4a9-3563-4423-8185-089c35559589"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.730522 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:08 crc kubenswrapper[4759]: I1205 00:49:08.730549 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de04d4a9-3563-4423-8185-089c35559589-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.156432 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:49:09 crc kubenswrapper[4759]: E1205 00:49:09.156756 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.346444 4759 generic.go:334] "Generic (PLEG): container finished" podID="c62aef30-b249-48f4-944a-094a692b4101" containerID="4b0f21d1f7af2bdf13101953c5bb0641f0c1512dca49e0a0b4ce9787f04d5b83" exitCode=0 Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.346563 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h49b8" event={"ID":"c62aef30-b249-48f4-944a-094a692b4101","Type":"ContainerDied","Data":"4b0f21d1f7af2bdf13101953c5bb0641f0c1512dca49e0a0b4ce9787f04d5b83"} Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.352714 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"de04d4a9-3563-4423-8185-089c35559589","Type":"ContainerDied","Data":"219221399cfc497fcfb435bd965c51571346b558a343555eaed6b52c1be3739a"} Dec 05 00:49:09 crc 
kubenswrapper[4759]: I1205 00:49:09.352772 4759 scope.go:117] "RemoveContainer" containerID="f8fe1cc148e8ef06846e04effd3c59424ed1a467d4af272abd79ffe3f2aed485" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.352953 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.398506 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.409487 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.434385 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 05 00:49:09 crc kubenswrapper[4759]: E1205 00:49:09.435195 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-evaluator" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.435216 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-evaluator" Dec 05 00:49:09 crc kubenswrapper[4759]: E1205 00:49:09.435277 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-notifier" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.435288 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-notifier" Dec 05 00:49:09 crc kubenswrapper[4759]: E1205 00:49:09.435315 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77c176e-03d5-4b5f-908f-a95c826fea16" containerName="nova-manage" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.435322 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77c176e-03d5-4b5f-908f-a95c826fea16" containerName="nova-manage" Dec 05 00:49:09 crc kubenswrapper[4759]: E1205 00:49:09.435345 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-listener" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.435354 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-listener" Dec 05 00:49:09 crc kubenswrapper[4759]: E1205 00:49:09.435368 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-api" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.435375 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-api" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.435782 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-listener" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.435807 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-api" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.435816 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d77c176e-03d5-4b5f-908f-a95c826fea16" containerName="nova-manage" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.435833 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-notifier" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 
00:49:09.438011 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="de04d4a9-3563-4423-8185-089c35559589" containerName="aodh-evaluator" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.444798 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.452855 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.453138 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-mr6x2" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.453418 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.453580 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.455449 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.470864 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.548367 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-config-data\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.548417 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml82s\" (UniqueName: \"kubernetes.io/projected/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-kube-api-access-ml82s\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.548467 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-public-tls-certs\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.548483 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-combined-ca-bundle\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.548543 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-internal-tls-certs\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.548610 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-scripts\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.561416 4759 
scope.go:117] "RemoveContainer" containerID="335609a57ad8d4769cf24f6194a490594ac72807791bc83339c83d385c6fdbac" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.602833 4759 scope.go:117] "RemoveContainer" containerID="d399944d0148d45787a61fd37d64736aaa479bd239455599558cad800f47ee8d" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.649599 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-config-data\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.649646 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml82s\" (UniqueName: \"kubernetes.io/projected/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-kube-api-access-ml82s\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.649689 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-combined-ca-bundle\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.649706 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-public-tls-certs\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.649770 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-internal-tls-certs\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.649856 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-scripts\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.653951 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-combined-ca-bundle\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.655250 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-config-data\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.655670 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-internal-tls-certs\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.656893 4759 scope.go:117] "RemoveContainer" containerID="df8a12e8047b8668c5d9a7627e03718bf1d339e6c26c3b2f657b8988b07bb38f" Dec 
05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.657494 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-public-tls-certs\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.662332 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-scripts\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.675298 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml82s\" (UniqueName: \"kubernetes.io/projected/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-kube-api-access-ml82s\") pod \"aodh-0\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " pod="openstack/aodh-0" Dec 05 00:49:09 crc kubenswrapper[4759]: I1205 00:49:09.840757 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.313053 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.368874 4759 generic.go:334] "Generic (PLEG): container finished" podID="d59eae83-132a-49a1-bcc8-997553b13d82" containerID="77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d" exitCode=0 Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.368989 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d59eae83-132a-49a1-bcc8-997553b13d82","Type":"ContainerDied","Data":"77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d"} Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.369022 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d59eae83-132a-49a1-bcc8-997553b13d82","Type":"ContainerDied","Data":"1d14db1bd89ccbd976ebfe41aa641faaabbfa74982d9b69f8ae217872e15f2b9"} Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.369043 4759 scope.go:117] "RemoveContainer" containerID="77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.369260 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.376488 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9szvl\" (UniqueName: \"kubernetes.io/projected/d59eae83-132a-49a1-bcc8-997553b13d82-kube-api-access-9szvl\") pod \"d59eae83-132a-49a1-bcc8-997553b13d82\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.376566 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-config-data\") pod \"d59eae83-132a-49a1-bcc8-997553b13d82\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.376630 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-combined-ca-bundle\") pod \"d59eae83-132a-49a1-bcc8-997553b13d82\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.376691 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d59eae83-132a-49a1-bcc8-997553b13d82-logs\") pod \"d59eae83-132a-49a1-bcc8-997553b13d82\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.376799 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-nova-metadata-tls-certs\") pod \"d59eae83-132a-49a1-bcc8-997553b13d82\" (UID: \"d59eae83-132a-49a1-bcc8-997553b13d82\") " Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.378790 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d59eae83-132a-49a1-bcc8-997553b13d82-logs" (OuterVolumeSpecName: "logs") pod "d59eae83-132a-49a1-bcc8-997553b13d82" (UID: "d59eae83-132a-49a1-bcc8-997553b13d82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.381545 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d59eae83-132a-49a1-bcc8-997553b13d82-kube-api-access-9szvl" (OuterVolumeSpecName: "kube-api-access-9szvl") pod "d59eae83-132a-49a1-bcc8-997553b13d82" (UID: "d59eae83-132a-49a1-bcc8-997553b13d82"). InnerVolumeSpecName "kube-api-access-9szvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.429670 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-config-data" (OuterVolumeSpecName: "config-data") pod "d59eae83-132a-49a1-bcc8-997553b13d82" (UID: "d59eae83-132a-49a1-bcc8-997553b13d82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.434629 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d59eae83-132a-49a1-bcc8-997553b13d82" (UID: "d59eae83-132a-49a1-bcc8-997553b13d82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.478652 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9szvl\" (UniqueName: \"kubernetes.io/projected/d59eae83-132a-49a1-bcc8-997553b13d82-kube-api-access-9szvl\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.478681 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.478690 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.478700 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d59eae83-132a-49a1-bcc8-997553b13d82-logs\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.480070 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d59eae83-132a-49a1-bcc8-997553b13d82" (UID: "d59eae83-132a-49a1-bcc8-997553b13d82"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.521268 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.581867 4759 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d59eae83-132a-49a1-bcc8-997553b13d82-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.591750 4759 scope.go:117] "RemoveContainer" containerID="345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.616872 4759 scope.go:117] "RemoveContainer" containerID="77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d" Dec 05 00:49:10 crc kubenswrapper[4759]: E1205 00:49:10.617301 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d\": container with ID starting with 77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d not found: ID does not exist" containerID="77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.617350 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d"} err="failed to get container status \"77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d\": rpc error: code = NotFound desc = could not find container \"77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d\": container with ID starting with 77e25ba1a5454ad64a1e3d6b4374ccda35e3bb28b594b95433c7b65e2cc3b85d not found: ID does not exist" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.617375 4759 scope.go:117] "RemoveContainer" 
containerID="345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef" Dec 05 00:49:10 crc kubenswrapper[4759]: E1205 00:49:10.617700 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef\": container with ID starting with 345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef not found: ID does not exist" containerID="345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.617732 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef"} err="failed to get container status \"345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef\": rpc error: code = NotFound desc = could not find container \"345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef\": container with ID starting with 345e6d1dcc738b6f97f2bd319b42b5195246eb2c96dccc7489e6c45c253ef3ef not found: ID does not exist" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.723958 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.737954 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.753722 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:49:10 crc kubenswrapper[4759]: E1205 00:49:10.754153 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-log" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.754170 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-log" Dec 05 00:49:10 crc kubenswrapper[4759]: E1205 00:49:10.754199 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-metadata" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.754206 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-metadata" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.754414 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-log" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.754438 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" containerName="nova-metadata-metadata" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.755464 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.757709 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.763347 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.785767 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbn87\" (UniqueName: \"kubernetes.io/projected/372a0f97-53ca-477d-9202-5616650e4192-kube-api-access-jbn87\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.785864 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/372a0f97-53ca-477d-9202-5616650e4192-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.785920 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/372a0f97-53ca-477d-9202-5616650e4192-config-data\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.785942 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/372a0f97-53ca-477d-9202-5616650e4192-logs\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.786033 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372a0f97-53ca-477d-9202-5616650e4192-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.791974 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.887727 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbn87\" (UniqueName: \"kubernetes.io/projected/372a0f97-53ca-477d-9202-5616650e4192-kube-api-access-jbn87\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.887864 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/372a0f97-53ca-477d-9202-5616650e4192-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.887946 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/372a0f97-53ca-477d-9202-5616650e4192-config-data\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " 
pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.887975 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/372a0f97-53ca-477d-9202-5616650e4192-logs\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.888125 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372a0f97-53ca-477d-9202-5616650e4192-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.889023 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/372a0f97-53ca-477d-9202-5616650e4192-logs\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.895068 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/372a0f97-53ca-477d-9202-5616650e4192-config-data\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.903699 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372a0f97-53ca-477d-9202-5616650e4192-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.903712 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/372a0f97-53ca-477d-9202-5616650e4192-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:10 crc kubenswrapper[4759]: I1205 00:49:10.906442 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbn87\" (UniqueName: \"kubernetes.io/projected/372a0f97-53ca-477d-9202-5616650e4192-kube-api-access-jbn87\") pod \"nova-metadata-0\" (UID: \"372a0f97-53ca-477d-9202-5616650e4192\") " pod="openstack/nova-metadata-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.074348 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.178358 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d59eae83-132a-49a1-bcc8-997553b13d82" path="/var/lib/kubelet/pods/d59eae83-132a-49a1-bcc8-997553b13d82/volumes" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.179638 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de04d4a9-3563-4423-8185-089c35559589" path="/var/lib/kubelet/pods/de04d4a9-3563-4423-8185-089c35559589/volumes" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.326721 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.387477 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277","Type":"ContainerStarted","Data":"2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1"} Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.387766 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277","Type":"ContainerStarted","Data":"09aad32a0d578c0a4029c2cbfaea9488314152a21f06ece3396f07f840fd2d78"} Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.389097 4759 generic.go:334] "Generic (PLEG): container finished" podID="aeed76aa-514a-433f-bce2-5d97d35e8534" containerID="9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113" exitCode=0 Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.389161 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.389171 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aeed76aa-514a-433f-bce2-5d97d35e8534","Type":"ContainerDied","Data":"9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113"} Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.389292 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aeed76aa-514a-433f-bce2-5d97d35e8534","Type":"ContainerDied","Data":"99cd2946562b108777f8910239fafe8b80d1d6763ed1a91912e81ec145b34f7a"} Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.389349 4759 scope.go:117] "RemoveContainer" containerID="9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.398932 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h49b8" event={"ID":"c62aef30-b249-48f4-944a-094a692b4101","Type":"ContainerStarted","Data":"317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa"} Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.445368 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h49b8" podStartSLOduration=3.451020903 podStartE2EDuration="6.445348336s" podCreationTimestamp="2025-12-05 00:49:05 +0000 UTC" firstStartedPulling="2025-12-05 00:49:07.31743191 +0000 UTC m=+1566.533092860" lastFinishedPulling="2025-12-05 00:49:10.311759343 +0000 UTC m=+1569.527420293" observedRunningTime="2025-12-05 00:49:11.430416001 +0000 UTC m=+1570.646076971" watchObservedRunningTime="2025-12-05 00:49:11.445348336 +0000 UTC m=+1570.661009286" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.488202 4759 scope.go:117] "RemoveContainer" containerID="9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113" Dec 05 00:49:11 crc kubenswrapper[4759]: E1205 00:49:11.491147 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113\": container with ID starting with 9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113 not found: ID does not exist" containerID="9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.491194 4759 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113"} err="failed to get container status \"9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113\": rpc error: code = NotFound desc = could not find container \"9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113\": container with ID starting with 9b08c05c3a42fa2faed1edecc8941b4ebc138a633d24249d4917857a34bf2113 not found: ID does not exist" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.509018 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-combined-ca-bundle\") pod \"aeed76aa-514a-433f-bce2-5d97d35e8534\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.509119 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnxj4\" (UniqueName: \"kubernetes.io/projected/aeed76aa-514a-433f-bce2-5d97d35e8534-kube-api-access-xnxj4\") pod \"aeed76aa-514a-433f-bce2-5d97d35e8534\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.509246 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-config-data\") pod \"aeed76aa-514a-433f-bce2-5d97d35e8534\" (UID: \"aeed76aa-514a-433f-bce2-5d97d35e8534\") " Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.518461 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeed76aa-514a-433f-bce2-5d97d35e8534-kube-api-access-xnxj4" (OuterVolumeSpecName: "kube-api-access-xnxj4") pod "aeed76aa-514a-433f-bce2-5d97d35e8534" (UID: "aeed76aa-514a-433f-bce2-5d97d35e8534"). InnerVolumeSpecName "kube-api-access-xnxj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.540229 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeed76aa-514a-433f-bce2-5d97d35e8534" (UID: "aeed76aa-514a-433f-bce2-5d97d35e8534"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.541444 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-config-data" (OuterVolumeSpecName: "config-data") pod "aeed76aa-514a-433f-bce2-5d97d35e8534" (UID: "aeed76aa-514a-433f-bce2-5d97d35e8534"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.612931 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnxj4\" (UniqueName: \"kubernetes.io/projected/aeed76aa-514a-433f-bce2-5d97d35e8534-kube-api-access-xnxj4\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.612963 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.612974 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeed76aa-514a-433f-bce2-5d97d35e8534-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.614742 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 00:49:11 crc kubenswrapper[4759]: W1205 00:49:11.619242 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod372a0f97_53ca_477d_9202_5616650e4192.slice/crio-2fcd53af7039c2af32c86932a5a3aa2c32e1f4bff7d0ce6bcdef79eb403c469c WatchSource:0}: Error finding container 2fcd53af7039c2af32c86932a5a3aa2c32e1f4bff7d0ce6bcdef79eb403c469c: Status 404 returned error can't find the container with id 2fcd53af7039c2af32c86932a5a3aa2c32e1f4bff7d0ce6bcdef79eb403c469c Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.745556 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.769572 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.799161 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:49:11 crc kubenswrapper[4759]: E1205 00:49:11.799753 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeed76aa-514a-433f-bce2-5d97d35e8534" containerName="nova-scheduler-scheduler" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.799777 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeed76aa-514a-433f-bce2-5d97d35e8534" containerName="nova-scheduler-scheduler" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.800048 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeed76aa-514a-433f-bce2-5d97d35e8534" containerName="nova-scheduler-scheduler" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.801193 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.803690 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.815434 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.824516 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkb76\" (UniqueName: \"kubernetes.io/projected/f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9-kube-api-access-jkb76\") pod \"nova-scheduler-0\" (UID: \"f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9\") " pod="openstack/nova-scheduler-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.824751 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9\") " pod="openstack/nova-scheduler-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.824800 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9-config-data\") pod \"nova-scheduler-0\" (UID: \"f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9\") " pod="openstack/nova-scheduler-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.927436 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9\") " pod="openstack/nova-scheduler-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.927485 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9-config-data\") pod \"nova-scheduler-0\" (UID: \"f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9\") " pod="openstack/nova-scheduler-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.927598 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkb76\" (UniqueName: \"kubernetes.io/projected/f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9-kube-api-access-jkb76\") pod \"nova-scheduler-0\" (UID: \"f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9\") " pod="openstack/nova-scheduler-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.932992 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9\") " pod="openstack/nova-scheduler-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.933322 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9-config-data\") pod \"nova-scheduler-0\" (UID: \"f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9\") " pod="openstack/nova-scheduler-0" Dec 05 00:49:11 crc kubenswrapper[4759]: I1205 00:49:11.946988 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkb76\" (UniqueName: 
\"kubernetes.io/projected/f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9-kube-api-access-jkb76\") pod \"nova-scheduler-0\" (UID: \"f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9\") " pod="openstack/nova-scheduler-0" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.129655 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.421934 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"372a0f97-53ca-477d-9202-5616650e4192","Type":"ContainerStarted","Data":"56464e44a792fda162d35e43f5347802f5726b0464350619bbb4f64940222be7"} Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.422401 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"372a0f97-53ca-477d-9202-5616650e4192","Type":"ContainerStarted","Data":"fc4bf6d5bfbb989e3522610d9b04015b0cf3347558a7014bcd235a089b2bdf46"} Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.422411 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"372a0f97-53ca-477d-9202-5616650e4192","Type":"ContainerStarted","Data":"2fcd53af7039c2af32c86932a5a3aa2c32e1f4bff7d0ce6bcdef79eb403c469c"} Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.445633 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277","Type":"ContainerStarted","Data":"1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135"} Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.447416 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.447397537 podStartE2EDuration="2.447397537s" podCreationTimestamp="2025-12-05 00:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:49:12.436409108 +0000 UTC m=+1571.652070058" watchObservedRunningTime="2025-12-05 00:49:12.447397537 +0000 UTC m=+1571.663058487" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.449612 4759 generic.go:334] "Generic (PLEG): container finished" podID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerID="167eeabd6555cc4dedcecf2f4134188ff35d2fb9a8a1b04b7b689ab8bc158bad" exitCode=0 Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.449660 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59181eac-c082-4e0e-b3d2-24fc49566a4b","Type":"ContainerDied","Data":"167eeabd6555cc4dedcecf2f4134188ff35d2fb9a8a1b04b7b689ab8bc158bad"} Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.545886 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.687491 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.744716 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54j2c\" (UniqueName: \"kubernetes.io/projected/59181eac-c082-4e0e-b3d2-24fc49566a4b-kube-api-access-54j2c\") pod \"59181eac-c082-4e0e-b3d2-24fc49566a4b\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.744778 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-public-tls-certs\") pod \"59181eac-c082-4e0e-b3d2-24fc49566a4b\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.744866 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59181eac-c082-4e0e-b3d2-24fc49566a4b-logs\") pod \"59181eac-c082-4e0e-b3d2-24fc49566a4b\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.744892 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-config-data\") pod \"59181eac-c082-4e0e-b3d2-24fc49566a4b\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.744984 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-combined-ca-bundle\") pod \"59181eac-c082-4e0e-b3d2-24fc49566a4b\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.745105 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-internal-tls-certs\") pod \"59181eac-c082-4e0e-b3d2-24fc49566a4b\" (UID: \"59181eac-c082-4e0e-b3d2-24fc49566a4b\") " Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.745508 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59181eac-c082-4e0e-b3d2-24fc49566a4b-logs" (OuterVolumeSpecName: "logs") pod "59181eac-c082-4e0e-b3d2-24fc49566a4b" (UID: "59181eac-c082-4e0e-b3d2-24fc49566a4b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.745746 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59181eac-c082-4e0e-b3d2-24fc49566a4b-logs\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.750351 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59181eac-c082-4e0e-b3d2-24fc49566a4b-kube-api-access-54j2c" (OuterVolumeSpecName: "kube-api-access-54j2c") pod "59181eac-c082-4e0e-b3d2-24fc49566a4b" (UID: "59181eac-c082-4e0e-b3d2-24fc49566a4b"). InnerVolumeSpecName "kube-api-access-54j2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.776054 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59181eac-c082-4e0e-b3d2-24fc49566a4b" (UID: "59181eac-c082-4e0e-b3d2-24fc49566a4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.786481 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-config-data" (OuterVolumeSpecName: "config-data") pod "59181eac-c082-4e0e-b3d2-24fc49566a4b" (UID: "59181eac-c082-4e0e-b3d2-24fc49566a4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.822365 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "59181eac-c082-4e0e-b3d2-24fc49566a4b" (UID: "59181eac-c082-4e0e-b3d2-24fc49566a4b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.849352 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54j2c\" (UniqueName: \"kubernetes.io/projected/59181eac-c082-4e0e-b3d2-24fc49566a4b-kube-api-access-54j2c\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.849389 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.849402 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.849416 4759 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.857681 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "59181eac-c082-4e0e-b3d2-24fc49566a4b" (UID: "59181eac-c082-4e0e-b3d2-24fc49566a4b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:12 crc kubenswrapper[4759]: I1205 00:49:12.951795 4759 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59181eac-c082-4e0e-b3d2-24fc49566a4b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.168141 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeed76aa-514a-433f-bce2-5d97d35e8534" path="/var/lib/kubelet/pods/aeed76aa-514a-433f-bce2-5d97d35e8534/volumes" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.471788 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59181eac-c082-4e0e-b3d2-24fc49566a4b","Type":"ContainerDied","Data":"7279e3fffe0742d85b94cf4a1eb82f88d65077102f3b541033dbccf3861f041e"} Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.472075 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.472249 4759 scope.go:117] "RemoveContainer" containerID="167eeabd6555cc4dedcecf2f4134188ff35d2fb9a8a1b04b7b689ab8bc158bad" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.485002 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277","Type":"ContainerStarted","Data":"559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a"} Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.485079 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277","Type":"ContainerStarted","Data":"badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87"} Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.489413 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9","Type":"ContainerStarted","Data":"0fb8f2f2f86ed4822b10e099849463d1b573fd049467c6fe0503120895f98d30"} Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.489469 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9","Type":"ContainerStarted","Data":"3e9be40208fafbbd569f502476b5363684cf47ecfcc9cf6c35467f14b0ccc64d"} Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.506751 4759 scope.go:117] "RemoveContainer" containerID="6ce6e93417dddb90ad76471c046a7b77ed169ac491e7d3518b383623ef082562" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.509126 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.544566 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.572152 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.012072915 podStartE2EDuration="4.572133603s" podCreationTimestamp="2025-12-05 00:49:09 +0000 UTC" firstStartedPulling="2025-12-05 00:49:10.521867024 +0000 UTC m=+1569.737527984" lastFinishedPulling="2025-12-05 00:49:13.081927702 +0000 UTC m=+1572.297588672" observedRunningTime="2025-12-05 00:49:13.522023529 +0000 UTC m=+1572.737684489" watchObservedRunningTime="2025-12-05 00:49:13.572133603 +0000 UTC m=+1572.787794543" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.641876 4759 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 00:49:13 crc kubenswrapper[4759]: E1205 00:49:13.642444 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-log" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.642474 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-log" Dec 05 00:49:13 crc kubenswrapper[4759]: E1205 00:49:13.642530 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-api" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.642541 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-api" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.642761 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-log" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.642809 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" containerName="nova-api-api" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.643966 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.647298 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.647615 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.649198 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.651607 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.661993 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.661970997 podStartE2EDuration="2.661970997s" podCreationTimestamp="2025-12-05 00:49:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:49:13.553687743 +0000 UTC m=+1572.769348693" watchObservedRunningTime="2025-12-05 00:49:13.661970997 +0000 UTC m=+1572.877631947" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.671048 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.671089 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnzl6\" (UniqueName: \"kubernetes.io/projected/a886f187-7e44-44b0-8dd6-030df520def9-kube-api-access-xnzl6\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.671144 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.671171 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-public-tls-certs\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.671193 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-config-data\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.671250 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a886f187-7e44-44b0-8dd6-030df520def9-logs\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.773217 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.773275 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnzl6\" (UniqueName: \"kubernetes.io/projected/a886f187-7e44-44b0-8dd6-030df520def9-kube-api-access-xnzl6\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.773402 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.773453 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-public-tls-certs\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.773496 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-config-data\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.773605 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a886f187-7e44-44b0-8dd6-030df520def9-logs\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.774146 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/a886f187-7e44-44b0-8dd6-030df520def9-logs\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.780211 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-public-tls-certs\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.780826 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.795598 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.798059 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnzl6\" (UniqueName: \"kubernetes.io/projected/a886f187-7e44-44b0-8dd6-030df520def9-kube-api-access-xnzl6\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.801982 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a886f187-7e44-44b0-8dd6-030df520def9-config-data\") pod \"nova-api-0\" (UID: \"a886f187-7e44-44b0-8dd6-030df520def9\") " pod="openstack/nova-api-0" Dec 05 00:49:13 crc kubenswrapper[4759]: I1205 00:49:13.964394 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 00:49:14 crc kubenswrapper[4759]: W1205 00:49:14.452483 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda886f187_7e44_44b0_8dd6_030df520def9.slice/crio-c735a6fe04afd984fa18e635197a343f2b4f8f7aa79b6f6d5e31e2f0ee078e30 WatchSource:0}: Error finding container c735a6fe04afd984fa18e635197a343f2b4f8f7aa79b6f6d5e31e2f0ee078e30: Status 404 returned error can't find the container with id c735a6fe04afd984fa18e635197a343f2b4f8f7aa79b6f6d5e31e2f0ee078e30 Dec 05 00:49:14 crc kubenswrapper[4759]: I1205 00:49:14.453587 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 00:49:14 crc kubenswrapper[4759]: I1205 00:49:14.525642 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a886f187-7e44-44b0-8dd6-030df520def9","Type":"ContainerStarted","Data":"c735a6fe04afd984fa18e635197a343f2b4f8f7aa79b6f6d5e31e2f0ee078e30"} Dec 05 00:49:15 crc kubenswrapper[4759]: I1205 00:49:15.170631 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59181eac-c082-4e0e-b3d2-24fc49566a4b" path="/var/lib/kubelet/pods/59181eac-c082-4e0e-b3d2-24fc49566a4b/volumes" Dec 05 00:49:15 crc kubenswrapper[4759]: I1205 00:49:15.548163 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a886f187-7e44-44b0-8dd6-030df520def9","Type":"ContainerStarted","Data":"e43b2bfd26c01afa01496c86ce69e3abb21de29aba7709aaadf79b383a97e090"} Dec 05 00:49:15 crc kubenswrapper[4759]: I1205 00:49:15.548229 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a886f187-7e44-44b0-8dd6-030df520def9","Type":"ContainerStarted","Data":"d9312de31d503c812967b9bb6dcd2120c9e12e0780fc007ea4c57fbb2c1b02b3"} Dec 05 00:49:15 crc kubenswrapper[4759]: I1205 00:49:15.580221 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5801979409999998 podStartE2EDuration="2.580197941s" podCreationTimestamp="2025-12-05 00:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:49:15.56541092 +0000 UTC m=+1574.781071870" watchObservedRunningTime="2025-12-05 00:49:15.580197941 +0000 UTC m=+1574.795858891" Dec 05 00:49:16 crc kubenswrapper[4759]: I1205 00:49:16.041087 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:16 crc kubenswrapper[4759]: I1205 00:49:16.041533 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:16 crc kubenswrapper[4759]: I1205 00:49:16.074883 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 00:49:16 crc kubenswrapper[4759]: I1205 00:49:16.074949 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 00:49:16 crc kubenswrapper[4759]: I1205 00:49:16.117827 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:16 crc kubenswrapper[4759]: I1205 00:49:16.643028 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:16 crc 
kubenswrapper[4759]: I1205 00:49:16.717974 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h49b8"] Dec 05 00:49:17 crc kubenswrapper[4759]: I1205 00:49:17.131093 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 00:49:18 crc kubenswrapper[4759]: I1205 00:49:18.643196 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h49b8" podUID="c62aef30-b249-48f4-944a-094a692b4101" containerName="registry-server" containerID="cri-o://317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa" gracePeriod=2 Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.223121 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.346617 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-utilities\") pod \"c62aef30-b249-48f4-944a-094a692b4101\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.347184 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-catalog-content\") pod \"c62aef30-b249-48f4-944a-094a692b4101\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.347450 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkb48\" (UniqueName: \"kubernetes.io/projected/c62aef30-b249-48f4-944a-094a692b4101-kube-api-access-hkb48\") pod \"c62aef30-b249-48f4-944a-094a692b4101\" (UID: \"c62aef30-b249-48f4-944a-094a692b4101\") " Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.347522 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-utilities" (OuterVolumeSpecName: "utilities") pod "c62aef30-b249-48f4-944a-094a692b4101" (UID: "c62aef30-b249-48f4-944a-094a692b4101"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.348499 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.355841 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c62aef30-b249-48f4-944a-094a692b4101-kube-api-access-hkb48" (OuterVolumeSpecName: "kube-api-access-hkb48") pod "c62aef30-b249-48f4-944a-094a692b4101" (UID: "c62aef30-b249-48f4-944a-094a692b4101"). InnerVolumeSpecName "kube-api-access-hkb48". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.412171 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c62aef30-b249-48f4-944a-094a692b4101" (UID: "c62aef30-b249-48f4-944a-094a692b4101"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.451708 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkb48\" (UniqueName: \"kubernetes.io/projected/c62aef30-b249-48f4-944a-094a692b4101-kube-api-access-hkb48\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.451751 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c62aef30-b249-48f4-944a-094a692b4101-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.660526 4759 generic.go:334] "Generic (PLEG): container finished" podID="c62aef30-b249-48f4-944a-094a692b4101" containerID="317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa" exitCode=0 Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.660591 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h49b8" event={"ID":"c62aef30-b249-48f4-944a-094a692b4101","Type":"ContainerDied","Data":"317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa"} Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.660633 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h49b8" event={"ID":"c62aef30-b249-48f4-944a-094a692b4101","Type":"ContainerDied","Data":"ae95341db8a35fa1519112eb2baaf5db925f11070c1ade36754a0ab193838797"} Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.660662 4759 scope.go:117] "RemoveContainer" containerID="317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.661088 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h49b8" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.705499 4759 scope.go:117] "RemoveContainer" containerID="4b0f21d1f7af2bdf13101953c5bb0641f0c1512dca49e0a0b4ce9787f04d5b83" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.726655 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h49b8"] Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.744721 4759 scope.go:117] "RemoveContainer" containerID="511d813cc570a0a9f427c5ac60dd59e18884cf21cdb41db81648208785e9aa26" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.745576 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h49b8"] Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.809516 4759 scope.go:117] "RemoveContainer" containerID="317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa" Dec 05 00:49:19 crc kubenswrapper[4759]: E1205 00:49:19.810154 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa\": container with ID starting with 317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa not found: ID does not exist" containerID="317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.810204 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa"} err="failed to get container status \"317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa\": rpc error: code = NotFound desc = could not find container \"317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa\": container with ID starting with 317decb98ba8bac86488f77c657c9d8a7e09b16a4871940b5745178ce8647baa not found: ID does not exist" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.810233 4759 scope.go:117] "RemoveContainer" containerID="4b0f21d1f7af2bdf13101953c5bb0641f0c1512dca49e0a0b4ce9787f04d5b83" Dec 05 00:49:19 crc kubenswrapper[4759]: E1205 00:49:19.811017 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b0f21d1f7af2bdf13101953c5bb0641f0c1512dca49e0a0b4ce9787f04d5b83\": container with ID starting with 4b0f21d1f7af2bdf13101953c5bb0641f0c1512dca49e0a0b4ce9787f04d5b83 not found: ID does not exist" containerID="4b0f21d1f7af2bdf13101953c5bb0641f0c1512dca49e0a0b4ce9787f04d5b83" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.811085 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b0f21d1f7af2bdf13101953c5bb0641f0c1512dca49e0a0b4ce9787f04d5b83"} err="failed to get container status \"4b0f21d1f7af2bdf13101953c5bb0641f0c1512dca49e0a0b4ce9787f04d5b83\": rpc error: code = NotFound desc = could not find container \"4b0f21d1f7af2bdf13101953c5bb0641f0c1512dca49e0a0b4ce9787f04d5b83\": container with ID starting with 4b0f21d1f7af2bdf13101953c5bb0641f0c1512dca49e0a0b4ce9787f04d5b83 not found: ID does not exist" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.811129 4759 scope.go:117] "RemoveContainer" containerID="511d813cc570a0a9f427c5ac60dd59e18884cf21cdb41db81648208785e9aa26" Dec 05 00:49:19 crc kubenswrapper[4759]: E1205 00:49:19.811620 4759 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"511d813cc570a0a9f427c5ac60dd59e18884cf21cdb41db81648208785e9aa26\": container with ID starting with 511d813cc570a0a9f427c5ac60dd59e18884cf21cdb41db81648208785e9aa26 not found: ID does not exist" containerID="511d813cc570a0a9f427c5ac60dd59e18884cf21cdb41db81648208785e9aa26" Dec 05 00:49:19 crc kubenswrapper[4759]: I1205 00:49:19.811665 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"511d813cc570a0a9f427c5ac60dd59e18884cf21cdb41db81648208785e9aa26"} err="failed to get container status \"511d813cc570a0a9f427c5ac60dd59e18884cf21cdb41db81648208785e9aa26\": rpc error: code = NotFound desc = could not find container \"511d813cc570a0a9f427c5ac60dd59e18884cf21cdb41db81648208785e9aa26\": container with ID starting with 511d813cc570a0a9f427c5ac60dd59e18884cf21cdb41db81648208785e9aa26 not found: ID does not exist" Dec 05 00:49:21 crc kubenswrapper[4759]: I1205 00:49:21.075048 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 00:49:21 crc kubenswrapper[4759]: I1205 00:49:21.075432 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 00:49:21 crc kubenswrapper[4759]: I1205 00:49:21.181625 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c62aef30-b249-48f4-944a-094a692b4101" path="/var/lib/kubelet/pods/c62aef30-b249-48f4-944a-094a692b4101/volumes" Dec 05 00:49:22 crc kubenswrapper[4759]: I1205 00:49:22.090490 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="372a0f97-53ca-477d-9202-5616650e4192" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.241:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 00:49:22 crc kubenswrapper[4759]: I1205 00:49:22.090541 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="372a0f97-53ca-477d-9202-5616650e4192" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.241:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 00:49:22 crc kubenswrapper[4759]: I1205 00:49:22.130356 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 00:49:22 crc kubenswrapper[4759]: I1205 00:49:22.156724 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:49:22 crc kubenswrapper[4759]: E1205 00:49:22.157110 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:49:22 crc kubenswrapper[4759]: I1205 00:49:22.190101 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 00:49:22 crc kubenswrapper[4759]: I1205 00:49:22.758448 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 00:49:23 crc kubenswrapper[4759]: I1205 00:49:23.578200 4759 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 00:49:23 crc kubenswrapper[4759]: I1205 00:49:23.966002 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 00:49:23 crc kubenswrapper[4759]: I1205 00:49:23.966073 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 00:49:24 crc kubenswrapper[4759]: I1205 00:49:24.973902 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a886f187-7e44-44b0-8dd6-030df520def9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.243:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 00:49:24 crc kubenswrapper[4759]: I1205 00:49:24.981529 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a886f187-7e44-44b0-8dd6-030df520def9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.243:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 00:49:31 crc kubenswrapper[4759]: I1205 00:49:31.080590 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 00:49:31 crc kubenswrapper[4759]: I1205 00:49:31.082719 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 00:49:31 crc kubenswrapper[4759]: I1205 00:49:31.089878 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 00:49:31 crc kubenswrapper[4759]: I1205 00:49:31.837848 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 00:49:33 crc kubenswrapper[4759]: I1205 00:49:33.156188 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:49:33 crc kubenswrapper[4759]: E1205 00:49:33.156541 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:49:33 crc kubenswrapper[4759]: I1205 00:49:33.977985 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 00:49:33 crc kubenswrapper[4759]: I1205 00:49:33.978684 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 00:49:33 crc kubenswrapper[4759]: I1205 00:49:33.979271 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 00:49:33 crc kubenswrapper[4759]: I1205 00:49:33.986085 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 00:49:34 crc kubenswrapper[4759]: I1205 00:49:34.861028 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 00:49:34 crc kubenswrapper[4759]: I1205 00:49:34.868205 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.553717 4759 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-xbx7b"] Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.568921 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-xbx7b"] Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.631260 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-q2dv8"] Dec 05 00:49:45 crc kubenswrapper[4759]: E1205 00:49:45.631805 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62aef30-b249-48f4-944a-094a692b4101" containerName="extract-utilities" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.631830 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62aef30-b249-48f4-944a-094a692b4101" containerName="extract-utilities" Dec 05 00:49:45 crc kubenswrapper[4759]: E1205 00:49:45.631854 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62aef30-b249-48f4-944a-094a692b4101" containerName="extract-content" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.631863 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62aef30-b249-48f4-944a-094a692b4101" containerName="extract-content" Dec 05 00:49:45 crc kubenswrapper[4759]: E1205 00:49:45.631897 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62aef30-b249-48f4-944a-094a692b4101" containerName="registry-server" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.631904 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62aef30-b249-48f4-944a-094a692b4101" containerName="registry-server" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.632137 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c62aef30-b249-48f4-944a-094a692b4101" containerName="registry-server" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.633056 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-q2dv8" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.643672 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-q2dv8"] Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.794126 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p8rm\" (UniqueName: \"kubernetes.io/projected/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-kube-api-access-7p8rm\") pod \"heat-db-sync-q2dv8\" (UID: \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " pod="openstack/heat-db-sync-q2dv8" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.794490 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-combined-ca-bundle\") pod \"heat-db-sync-q2dv8\" (UID: \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " pod="openstack/heat-db-sync-q2dv8" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.794627 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-config-data\") pod \"heat-db-sync-q2dv8\" (UID: \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " pod="openstack/heat-db-sync-q2dv8" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.899992 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p8rm\" (UniqueName: \"kubernetes.io/projected/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-kube-api-access-7p8rm\") pod \"heat-db-sync-q2dv8\" (UID: \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " pod="openstack/heat-db-sync-q2dv8" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.900134 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-combined-ca-bundle\") pod \"heat-db-sync-q2dv8\" (UID: \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " pod="openstack/heat-db-sync-q2dv8" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.900185 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-config-data\") pod \"heat-db-sync-q2dv8\" (UID: \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " pod="openstack/heat-db-sync-q2dv8" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.906808 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-config-data\") pod \"heat-db-sync-q2dv8\" (UID: \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " pod="openstack/heat-db-sync-q2dv8" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.915857 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-combined-ca-bundle\") pod \"heat-db-sync-q2dv8\" (UID: \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " pod="openstack/heat-db-sync-q2dv8" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.930103 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p8rm\" (UniqueName: \"kubernetes.io/projected/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-kube-api-access-7p8rm\") pod \"heat-db-sync-q2dv8\" (UID: 
\"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " pod="openstack/heat-db-sync-q2dv8" Dec 05 00:49:45 crc kubenswrapper[4759]: I1205 00:49:45.952721 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-q2dv8" Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.347384 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l7pzq"] Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.350031 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.362279 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7pzq"] Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.475969 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-q2dv8"] Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.511512 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-utilities\") pod \"redhat-marketplace-l7pzq\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.511580 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-catalog-content\") pod \"redhat-marketplace-l7pzq\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.511615 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xct24\" (UniqueName: \"kubernetes.io/projected/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-kube-api-access-xct24\") pod \"redhat-marketplace-l7pzq\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.613318 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-utilities\") pod \"redhat-marketplace-l7pzq\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.613387 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-catalog-content\") pod \"redhat-marketplace-l7pzq\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.613428 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xct24\" (UniqueName: \"kubernetes.io/projected/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-kube-api-access-xct24\") pod \"redhat-marketplace-l7pzq\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.614190 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-utilities\") pod \"redhat-marketplace-l7pzq\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.614470 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-catalog-content\") pod \"redhat-marketplace-l7pzq\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.646227 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xct24\" (UniqueName: \"kubernetes.io/projected/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-kube-api-access-xct24\") pod \"redhat-marketplace-l7pzq\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:46 crc kubenswrapper[4759]: I1205 00:49:46.693953 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:47 crc kubenswrapper[4759]: I1205 00:49:47.195589 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed73e23b-4161-4968-93d0-aaabce1aa4bb" path="/var/lib/kubelet/pods/ed73e23b-4161-4968-93d0-aaabce1aa4bb/volumes" Dec 05 00:49:47 crc kubenswrapper[4759]: I1205 00:49:47.196449 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-q2dv8" event={"ID":"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd","Type":"ContainerStarted","Data":"60c25533e1859fc94306417b1e465b30b61f7580436684917c956ea2850af615"} Dec 05 00:49:47 crc kubenswrapper[4759]: I1205 00:49:47.231106 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7pzq"] Dec 05 00:49:47 crc kubenswrapper[4759]: W1205 00:49:47.238232 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9dfe817_0ae0_4895_9b5d_ad80ae8b102b.slice/crio-64bdfc59d8c7a866c7ceef3d28a625a82ba0a986f056c1c7f4fb7fbc4dd6d524 WatchSource:0}: Error finding container 64bdfc59d8c7a866c7ceef3d28a625a82ba0a986f056c1c7f4fb7fbc4dd6d524: Status 404 returned error can't find the container with id 64bdfc59d8c7a866c7ceef3d28a625a82ba0a986f056c1c7f4fb7fbc4dd6d524 Dec 05 00:49:47 crc kubenswrapper[4759]: I1205 00:49:47.620559 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 00:49:48 crc kubenswrapper[4759]: I1205 00:49:48.156346 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:49:48 crc kubenswrapper[4759]: E1205 00:49:48.156604 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:49:48 crc kubenswrapper[4759]: I1205 00:49:48.211703 4759 generic.go:334] "Generic (PLEG): container finished" podID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" containerID="55e881d961135ec56ffaae608f20f8599d0be85929340ed4e67a4b19e1693f61" exitCode=0 Dec 05 00:49:48 
crc kubenswrapper[4759]: I1205 00:49:48.211766 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7pzq" event={"ID":"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b","Type":"ContainerDied","Data":"55e881d961135ec56ffaae608f20f8599d0be85929340ed4e67a4b19e1693f61"} Dec 05 00:49:48 crc kubenswrapper[4759]: I1205 00:49:48.211821 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7pzq" event={"ID":"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b","Type":"ContainerStarted","Data":"64bdfc59d8c7a866c7ceef3d28a625a82ba0a986f056c1c7f4fb7fbc4dd6d524"} Dec 05 00:49:48 crc kubenswrapper[4759]: I1205 00:49:48.338536 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 00:49:48 crc kubenswrapper[4759]: I1205 00:49:48.367520 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:49:48 crc kubenswrapper[4759]: I1205 00:49:48.367777 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="ceilometer-central-agent" containerID="cri-o://5d65692d8b3e6b75c1ba3daee41e2d256733b97789e9aa9bdded5fca431b4f97" gracePeriod=30 Dec 05 00:49:48 crc kubenswrapper[4759]: I1205 00:49:48.367894 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="proxy-httpd" containerID="cri-o://143f6072234304eba121d335063837cc15dea13c3c22abd9cfda90d478fa9e19" gracePeriod=30 Dec 05 00:49:48 crc kubenswrapper[4759]: I1205 00:49:48.367929 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="sg-core" containerID="cri-o://324b888f605a24c9effa764025721279a8154a57d044279ff27d66542fcae63c" gracePeriod=30 Dec 05 00:49:48 crc kubenswrapper[4759]: I1205 00:49:48.367959 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="ceilometer-notification-agent" containerID="cri-o://6698bd51a4d64dbab26494920e39e1b35b726ac23780c08c197f0392200f84d2" gracePeriod=30 Dec 05 00:49:49 crc kubenswrapper[4759]: I1205 00:49:49.231831 4759 generic.go:334] "Generic (PLEG): container finished" podID="1a0a1242-dc05-4127-b052-94dec63f8703" containerID="143f6072234304eba121d335063837cc15dea13c3c22abd9cfda90d478fa9e19" exitCode=0 Dec 05 00:49:49 crc kubenswrapper[4759]: I1205 00:49:49.232079 4759 generic.go:334] "Generic (PLEG): container finished" podID="1a0a1242-dc05-4127-b052-94dec63f8703" containerID="324b888f605a24c9effa764025721279a8154a57d044279ff27d66542fcae63c" exitCode=2 Dec 05 00:49:49 crc kubenswrapper[4759]: I1205 00:49:49.232093 4759 generic.go:334] "Generic (PLEG): container finished" podID="1a0a1242-dc05-4127-b052-94dec63f8703" containerID="5d65692d8b3e6b75c1ba3daee41e2d256733b97789e9aa9bdded5fca431b4f97" exitCode=0 Dec 05 00:49:49 crc kubenswrapper[4759]: I1205 00:49:49.232023 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a0a1242-dc05-4127-b052-94dec63f8703","Type":"ContainerDied","Data":"143f6072234304eba121d335063837cc15dea13c3c22abd9cfda90d478fa9e19"} Dec 05 00:49:49 crc kubenswrapper[4759]: I1205 00:49:49.232168 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1a0a1242-dc05-4127-b052-94dec63f8703","Type":"ContainerDied","Data":"324b888f605a24c9effa764025721279a8154a57d044279ff27d66542fcae63c"} Dec 05 00:49:49 crc kubenswrapper[4759]: I1205 00:49:49.232183 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a0a1242-dc05-4127-b052-94dec63f8703","Type":"ContainerDied","Data":"5d65692d8b3e6b75c1ba3daee41e2d256733b97789e9aa9bdded5fca431b4f97"} Dec 05 00:49:49 crc kubenswrapper[4759]: I1205 00:49:49.236666 4759 generic.go:334] "Generic (PLEG): container finished" podID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" containerID="c83f3d32474b3c646da6c46d72bb3c5c09d4fcf54de545f37d1c5420f77526c0" exitCode=0 Dec 05 00:49:49 crc kubenswrapper[4759]: I1205 00:49:49.236714 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7pzq" event={"ID":"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b","Type":"ContainerDied","Data":"c83f3d32474b3c646da6c46d72bb3c5c09d4fcf54de545f37d1c5420f77526c0"} Dec 05 00:49:50 crc kubenswrapper[4759]: I1205 00:49:50.269476 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7pzq" event={"ID":"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b","Type":"ContainerStarted","Data":"ca04ade59379a48ea6912bb16fd05a5d344ddd6cdc5f5a737972315b9d17009c"} Dec 05 00:49:50 crc kubenswrapper[4759]: I1205 00:49:50.292442 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l7pzq" podStartSLOduration=2.84769778 podStartE2EDuration="4.292425171s" podCreationTimestamp="2025-12-05 00:49:46 +0000 UTC" firstStartedPulling="2025-12-05 00:49:48.214144048 +0000 UTC m=+1607.429804998" lastFinishedPulling="2025-12-05 00:49:49.658871439 +0000 UTC m=+1608.874532389" observedRunningTime="2025-12-05 00:49:50.287536412 +0000 UTC m=+1609.503197362" watchObservedRunningTime="2025-12-05 00:49:50.292425171 +0000 UTC m=+1609.508086121" Dec 05 00:49:52 crc kubenswrapper[4759]: I1205 00:49:52.186346 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="9f171e66-8683-4856-8dd3-690bbdd0f6e5" containerName="rabbitmq" containerID="cri-o://4a743cc3904208a2b8acbf7150cd48744c98326fb8db6998d672e6baa70824e6" gracePeriod=604796 Dec 05 00:49:53 crc kubenswrapper[4759]: I1205 00:49:53.264062 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a699edc7-e60f-482d-962d-6c69d625a1c5" containerName="rabbitmq" containerID="cri-o://b511a66d1e004c0f268b8e6d9b0e5bc1a7b577592e9070b3ac0d5b290ddc8e97" gracePeriod=604796 Dec 05 00:49:53 crc kubenswrapper[4759]: I1205 00:49:53.566630 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.236:3000/\": dial tcp 10.217.0.236:3000: connect: connection refused" Dec 05 00:49:54 crc kubenswrapper[4759]: I1205 00:49:54.313945 4759 generic.go:334] "Generic (PLEG): container finished" podID="1a0a1242-dc05-4127-b052-94dec63f8703" containerID="6698bd51a4d64dbab26494920e39e1b35b726ac23780c08c197f0392200f84d2" exitCode=0 Dec 05 00:49:54 crc kubenswrapper[4759]: I1205 00:49:54.313988 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1a0a1242-dc05-4127-b052-94dec63f8703","Type":"ContainerDied","Data":"6698bd51a4d64dbab26494920e39e1b35b726ac23780c08c197f0392200f84d2"} Dec 05 00:49:54 crc kubenswrapper[4759]: I1205 00:49:54.542236 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="9f171e66-8683-4856-8dd3-690bbdd0f6e5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.118:5671: connect: connection refused" Dec 05 00:49:54 crc kubenswrapper[4759]: I1205 00:49:54.616363 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a699edc7-e60f-482d-962d-6c69d625a1c5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.119:5671: connect: connection refused" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.327824 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a0a1242-dc05-4127-b052-94dec63f8703","Type":"ContainerDied","Data":"f423a58ca77aa05de0057ced2256e775f873bfcb7d16ecc975918ff677bd55a5"} Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.328136 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f423a58ca77aa05de0057ced2256e775f873bfcb7d16ecc975918ff677bd55a5" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.362602 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.433913 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqfh6\" (UniqueName: \"kubernetes.io/projected/1a0a1242-dc05-4127-b052-94dec63f8703-kube-api-access-xqfh6\") pod \"1a0a1242-dc05-4127-b052-94dec63f8703\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.434060 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-ceilometer-tls-certs\") pod \"1a0a1242-dc05-4127-b052-94dec63f8703\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.434098 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-sg-core-conf-yaml\") pod \"1a0a1242-dc05-4127-b052-94dec63f8703\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.434139 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-combined-ca-bundle\") pod \"1a0a1242-dc05-4127-b052-94dec63f8703\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.434170 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-scripts\") pod \"1a0a1242-dc05-4127-b052-94dec63f8703\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.434198 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-run-httpd\") pod \"1a0a1242-dc05-4127-b052-94dec63f8703\" (UID: 
\"1a0a1242-dc05-4127-b052-94dec63f8703\") " Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.434241 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-config-data\") pod \"1a0a1242-dc05-4127-b052-94dec63f8703\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.434352 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-log-httpd\") pod \"1a0a1242-dc05-4127-b052-94dec63f8703\" (UID: \"1a0a1242-dc05-4127-b052-94dec63f8703\") " Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.435325 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1a0a1242-dc05-4127-b052-94dec63f8703" (UID: "1a0a1242-dc05-4127-b052-94dec63f8703"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.435443 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1a0a1242-dc05-4127-b052-94dec63f8703" (UID: "1a0a1242-dc05-4127-b052-94dec63f8703"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.443613 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a0a1242-dc05-4127-b052-94dec63f8703-kube-api-access-xqfh6" (OuterVolumeSpecName: "kube-api-access-xqfh6") pod "1a0a1242-dc05-4127-b052-94dec63f8703" (UID: "1a0a1242-dc05-4127-b052-94dec63f8703"). InnerVolumeSpecName "kube-api-access-xqfh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.443738 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-scripts" (OuterVolumeSpecName: "scripts") pod "1a0a1242-dc05-4127-b052-94dec63f8703" (UID: "1a0a1242-dc05-4127-b052-94dec63f8703"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.488097 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1a0a1242-dc05-4127-b052-94dec63f8703" (UID: "1a0a1242-dc05-4127-b052-94dec63f8703"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.518100 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1a0a1242-dc05-4127-b052-94dec63f8703" (UID: "1a0a1242-dc05-4127-b052-94dec63f8703"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.536223 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.536496 4759 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.536505 4759 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a0a1242-dc05-4127-b052-94dec63f8703-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.536514 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqfh6\" (UniqueName: \"kubernetes.io/projected/1a0a1242-dc05-4127-b052-94dec63f8703-kube-api-access-xqfh6\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.536525 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.536534 4759 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.538356 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a0a1242-dc05-4127-b052-94dec63f8703" (UID: "1a0a1242-dc05-4127-b052-94dec63f8703"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.587935 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-config-data" (OuterVolumeSpecName: "config-data") pod "1a0a1242-dc05-4127-b052-94dec63f8703" (UID: "1a0a1242-dc05-4127-b052-94dec63f8703"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.638111 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:55 crc kubenswrapper[4759]: I1205 00:49:55.638140 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0a1242-dc05-4127-b052-94dec63f8703-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.337501 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.374043 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.387736 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.433258 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:49:56 crc kubenswrapper[4759]: E1205 00:49:56.435391 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="sg-core" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.435427 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="sg-core" Dec 05 00:49:56 crc kubenswrapper[4759]: E1205 00:49:56.435456 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="proxy-httpd" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.435464 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="proxy-httpd" Dec 05 00:49:56 crc kubenswrapper[4759]: E1205 00:49:56.435478 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="ceilometer-notification-agent" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.435484 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="ceilometer-notification-agent" Dec 05 00:49:56 crc kubenswrapper[4759]: E1205 00:49:56.435506 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="ceilometer-central-agent" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.435512 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="ceilometer-central-agent" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.436031 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="ceilometer-notification-agent" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.436074 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="proxy-httpd" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.436089 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="sg-core" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.436105 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" containerName="ceilometer-central-agent" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.441116 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.443166 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.443513 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.443728 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.461419 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.564857 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-log-httpd\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.565209 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.565284 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-run-httpd\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.565370 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-scripts\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.565400 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g6mb\" (UniqueName: \"kubernetes.io/projected/16466e0b-7a83-46aa-b39e-d52ea5c19f86-kube-api-access-9g6mb\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.565421 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-config-data\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.565465 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.565528 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.667684 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.667810 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.667878 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-log-httpd\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.667936 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.668038 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-run-httpd\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.668143 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-scripts\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.668180 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g6mb\" (UniqueName: \"kubernetes.io/projected/16466e0b-7a83-46aa-b39e-d52ea5c19f86-kube-api-access-9g6mb\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.668208 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-config-data\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.669331 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-run-httpd\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.669495 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-log-httpd\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.674646 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.675075 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-config-data\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.675393 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.675814 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-scripts\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.685358 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.687817 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g6mb\" (UniqueName: \"kubernetes.io/projected/16466e0b-7a83-46aa-b39e-d52ea5c19f86-kube-api-access-9g6mb\") pod \"ceilometer-0\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.696005 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.696039 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.763985 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 00:49:56 crc kubenswrapper[4759]: I1205 00:49:56.770743 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:57 crc kubenswrapper[4759]: I1205 00:49:57.166998 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a0a1242-dc05-4127-b052-94dec63f8703" path="/var/lib/kubelet/pods/1a0a1242-dc05-4127-b052-94dec63f8703/volumes" Dec 05 00:49:57 crc kubenswrapper[4759]: I1205 00:49:57.394823 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:49:57 crc kubenswrapper[4759]: I1205 00:49:57.445718 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7pzq"] Dec 05 00:49:59 crc kubenswrapper[4759]: I1205 00:49:59.373595 4759 generic.go:334] "Generic (PLEG): container finished" podID="9f171e66-8683-4856-8dd3-690bbdd0f6e5" containerID="4a743cc3904208a2b8acbf7150cd48744c98326fb8db6998d672e6baa70824e6" exitCode=0 Dec 05 00:49:59 crc kubenswrapper[4759]: I1205 00:49:59.373684 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f171e66-8683-4856-8dd3-690bbdd0f6e5","Type":"ContainerDied","Data":"4a743cc3904208a2b8acbf7150cd48744c98326fb8db6998d672e6baa70824e6"} Dec 05 00:49:59 crc kubenswrapper[4759]: I1205 00:49:59.374020 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l7pzq" podUID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" containerName="registry-server" containerID="cri-o://ca04ade59379a48ea6912bb16fd05a5d344ddd6cdc5f5a737972315b9d17009c" gracePeriod=2 Dec 05 00:50:00 crc kubenswrapper[4759]: I1205 00:50:00.387641 4759 generic.go:334] "Generic (PLEG): container finished" podID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" containerID="ca04ade59379a48ea6912bb16fd05a5d344ddd6cdc5f5a737972315b9d17009c" exitCode=0 Dec 05 00:50:00 crc kubenswrapper[4759]: I1205 00:50:00.387667 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7pzq" event={"ID":"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b","Type":"ContainerDied","Data":"ca04ade59379a48ea6912bb16fd05a5d344ddd6cdc5f5a737972315b9d17009c"} Dec 05 00:50:00 crc kubenswrapper[4759]: I1205 00:50:00.389952 4759 generic.go:334] "Generic (PLEG): container finished" podID="a699edc7-e60f-482d-962d-6c69d625a1c5" containerID="b511a66d1e004c0f268b8e6d9b0e5bc1a7b577592e9070b3ac0d5b290ddc8e97" exitCode=0 Dec 05 00:50:00 crc kubenswrapper[4759]: I1205 00:50:00.389999 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a699edc7-e60f-482d-962d-6c69d625a1c5","Type":"ContainerDied","Data":"b511a66d1e004c0f268b8e6d9b0e5bc1a7b577592e9070b3ac0d5b290ddc8e97"} Dec 05 00:50:02 crc kubenswrapper[4759]: I1205 00:50:02.156611 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:50:02 crc kubenswrapper[4759]: E1205 00:50:02.157464 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" 
podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.071732 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.077362 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.083689 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.131747 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xct24\" (UniqueName: \"kubernetes.io/projected/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-kube-api-access-xct24\") pod \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.131789 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-catalog-content\") pod \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.131844 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.131864 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-plugins-conf\") pod \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.131890 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-tls\") pod \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.131909 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.131938 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22z8j\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-kube-api-access-22z8j\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.131986 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-erlang-cookie\") pod \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132014 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-config-data\") pod \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132036 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f171e66-8683-4856-8dd3-690bbdd0f6e5-erlang-cookie-secret\") pod \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132063 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-tls\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132081 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-erlang-cookie\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132114 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-plugins-conf\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132131 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79fbp\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-kube-api-access-79fbp\") pod \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132149 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-config-data\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132189 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a699edc7-e60f-482d-962d-6c69d625a1c5-erlang-cookie-secret\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132204 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-utilities\") pod \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\" (UID: \"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132235 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-confd\") pod \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132258 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a699edc7-e60f-482d-962d-6c69d625a1c5-pod-info\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132293 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-plugins\") pod \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132340 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-server-conf\") pod \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132365 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-confd\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132386 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-server-conf\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132426 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f171e66-8683-4856-8dd3-690bbdd0f6e5-pod-info\") pod \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\" (UID: \"9f171e66-8683-4856-8dd3-690bbdd0f6e5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.132443 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-plugins\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.134572 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9f171e66-8683-4856-8dd3-690bbdd0f6e5" (UID: "9f171e66-8683-4856-8dd3-690bbdd0f6e5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.135758 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9f171e66-8683-4856-8dd3-690bbdd0f6e5" (UID: "9f171e66-8683-4856-8dd3-690bbdd0f6e5"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.138889 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.146756 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.148475 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-kube-api-access-xct24" (OuterVolumeSpecName: "kube-api-access-xct24") pod "d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" (UID: "d9dfe817-0ae0-4895-9b5d-ad80ae8b102b"). InnerVolumeSpecName "kube-api-access-xct24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.151732 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9f171e66-8683-4856-8dd3-690bbdd0f6e5-pod-info" (OuterVolumeSpecName: "pod-info") pod "9f171e66-8683-4856-8dd3-690bbdd0f6e5" (UID: "9f171e66-8683-4856-8dd3-690bbdd0f6e5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.152471 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "9f171e66-8683-4856-8dd3-690bbdd0f6e5" (UID: "9f171e66-8683-4856-8dd3-690bbdd0f6e5"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.153132 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-utilities" (OuterVolumeSpecName: "utilities") pod "d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" (UID: "d9dfe817-0ae0-4895-9b5d-ad80ae8b102b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.153671 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.156243 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a699edc7-e60f-482d-962d-6c69d625a1c5-pod-info" (OuterVolumeSpecName: "pod-info") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.158464 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a699edc7-e60f-482d-962d-6c69d625a1c5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.159663 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9f171e66-8683-4856-8dd3-690bbdd0f6e5" (UID: "9f171e66-8683-4856-8dd3-690bbdd0f6e5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.163905 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-kube-api-access-22z8j" (OuterVolumeSpecName: "kube-api-access-22z8j") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "kube-api-access-22z8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.164158 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.168698 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.171650 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9f171e66-8683-4856-8dd3-690bbdd0f6e5" (UID: "9f171e66-8683-4856-8dd3-690bbdd0f6e5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.173698 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f171e66-8683-4856-8dd3-690bbdd0f6e5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9f171e66-8683-4856-8dd3-690bbdd0f6e5" (UID: "9f171e66-8683-4856-8dd3-690bbdd0f6e5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.173866 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-kube-api-access-79fbp" (OuterVolumeSpecName: "kube-api-access-79fbp") pod "9f171e66-8683-4856-8dd3-690bbdd0f6e5" (UID: "9f171e66-8683-4856-8dd3-690bbdd0f6e5"). 
InnerVolumeSpecName "kube-api-access-79fbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.206907 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" (UID: "d9dfe817-0ae0-4895-9b5d-ad80ae8b102b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.215537 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-config-data" (OuterVolumeSpecName: "config-data") pod "9f171e66-8683-4856-8dd3-690bbdd0f6e5" (UID: "9f171e66-8683-4856-8dd3-690bbdd0f6e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.224717 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-config-data" (OuterVolumeSpecName: "config-data") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234423 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xct24\" (UniqueName: \"kubernetes.io/projected/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-kube-api-access-xct24\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234455 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234476 4759 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234486 4759 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234496 4759 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234509 4759 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234519 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22z8j\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-kube-api-access-22z8j\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234528 4759 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-erlang-cookie\") on node \"crc\" 
DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234537 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234547 4759 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f171e66-8683-4856-8dd3-690bbdd0f6e5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234555 4759 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234563 4759 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234571 4759 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234580 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79fbp\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-kube-api-access-79fbp\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234589 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234597 4759 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a699edc7-e60f-482d-962d-6c69d625a1c5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234605 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234613 4759 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a699edc7-e60f-482d-962d-6c69d625a1c5-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234620 4759 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234628 4759 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f171e66-8683-4856-8dd3-690bbdd0f6e5-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.234636 4759 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: 
I1205 00:50:03.256875 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-server-conf" (OuterVolumeSpecName: "server-conf") pod "9f171e66-8683-4856-8dd3-690bbdd0f6e5" (UID: "9f171e66-8683-4856-8dd3-690bbdd0f6e5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.327184 4759 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.336016 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-server-conf" (OuterVolumeSpecName: "server-conf") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.342335 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-server-conf\") pod \"a699edc7-e60f-482d-962d-6c69d625a1c5\" (UID: \"a699edc7-e60f-482d-962d-6c69d625a1c5\") " Dec 05 00:50:03 crc kubenswrapper[4759]: W1205 00:50:03.343238 4759 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/a699edc7-e60f-482d-962d-6c69d625a1c5/volumes/kubernetes.io~configmap/server-conf Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.343256 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-server-conf" (OuterVolumeSpecName: "server-conf") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.343678 4759 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.343714 4759 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f171e66-8683-4856-8dd3-690bbdd0f6e5-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.346198 4759 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.386875 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9f171e66-8683-4856-8dd3-690bbdd0f6e5" (UID: "9f171e66-8683-4856-8dd3-690bbdd0f6e5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.421496 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a699edc7-e60f-482d-962d-6c69d625a1c5" (UID: "a699edc7-e60f-482d-962d-6c69d625a1c5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.445521 4759 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a699edc7-e60f-482d-962d-6c69d625a1c5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.445564 4759 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a699edc7-e60f-482d-962d-6c69d625a1c5-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.445578 4759 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.445587 4759 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f171e66-8683-4856-8dd3-690bbdd0f6e5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.447147 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.447128 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f171e66-8683-4856-8dd3-690bbdd0f6e5","Type":"ContainerDied","Data":"a1465f49b94d3a279c12de5be1cfbd3addbf482cea648523cc4fe4d63455be09"} Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.447340 4759 scope.go:117] "RemoveContainer" containerID="4a743cc3904208a2b8acbf7150cd48744c98326fb8db6998d672e6baa70824e6" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.457705 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7pzq" event={"ID":"d9dfe817-0ae0-4895-9b5d-ad80ae8b102b","Type":"ContainerDied","Data":"64bdfc59d8c7a866c7ceef3d28a625a82ba0a986f056c1c7f4fb7fbc4dd6d524"} Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.457841 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7pzq" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.468865 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a699edc7-e60f-482d-962d-6c69d625a1c5","Type":"ContainerDied","Data":"9464d48439fba77e99da0d777e6bcfb47ccaa8c5a1ea0daba13a9a8d8efeeb9b"} Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.468962 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.493795 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7pzq"] Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.515396 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7pzq"] Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.541198 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.563840 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.583404 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.593601 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.609139 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 00:50:03 crc kubenswrapper[4759]: E1205 00:50:03.609678 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f171e66-8683-4856-8dd3-690bbdd0f6e5" containerName="setup-container" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.609696 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f171e66-8683-4856-8dd3-690bbdd0f6e5" containerName="setup-container" Dec 05 00:50:03 crc kubenswrapper[4759]: E1205 00:50:03.609706 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f171e66-8683-4856-8dd3-690bbdd0f6e5" containerName="rabbitmq" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.609712 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f171e66-8683-4856-8dd3-690bbdd0f6e5" containerName="rabbitmq" Dec 05 00:50:03 crc kubenswrapper[4759]: E1205 00:50:03.609723 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" containerName="extract-utilities" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.609728 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" containerName="extract-utilities" Dec 05 00:50:03 crc kubenswrapper[4759]: E1205 00:50:03.609750 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" containerName="registry-server" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.609756 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" containerName="registry-server" Dec 05 00:50:03 crc kubenswrapper[4759]: E1205 00:50:03.609772 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a699edc7-e60f-482d-962d-6c69d625a1c5" containerName="rabbitmq" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.609777 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a699edc7-e60f-482d-962d-6c69d625a1c5" containerName="rabbitmq" Dec 05 00:50:03 crc kubenswrapper[4759]: E1205 00:50:03.609786 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" containerName="extract-content" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.609792 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" containerName="extract-content" Dec 05 
00:50:03 crc kubenswrapper[4759]: E1205 00:50:03.609802 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a699edc7-e60f-482d-962d-6c69d625a1c5" containerName="setup-container" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.609808 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a699edc7-e60f-482d-962d-6c69d625a1c5" containerName="setup-container" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.610030 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" containerName="registry-server" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.610051 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="a699edc7-e60f-482d-962d-6c69d625a1c5" containerName="rabbitmq" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.610065 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f171e66-8683-4856-8dd3-690bbdd0f6e5" containerName="rabbitmq" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.611175 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.613093 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.613444 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.613481 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.613638 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.613647 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.613709 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.613813 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9xqwl" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.625216 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.639400 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.641342 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.643751 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.643776 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.643791 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.643956 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-sxw7j" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.643981 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.644070 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.644236 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.650435 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.651137 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvthz\" (UniqueName: \"kubernetes.io/projected/1cf2a4df-221d-4b0c-8a47-114deb1af60a-kube-api-access-cvthz\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.651174 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1cf2a4df-221d-4b0c-8a47-114deb1af60a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.651224 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.651267 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.651285 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.651323 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/1cf2a4df-221d-4b0c-8a47-114deb1af60a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.651347 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.651406 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1cf2a4df-221d-4b0c-8a47-114deb1af60a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.651431 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cf2a4df-221d-4b0c-8a47-114deb1af60a-config-data\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.651500 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1cf2a4df-221d-4b0c-8a47-114deb1af60a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.651642 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.753669 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.753737 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/975c9850-0dc7-4b43-a521-015930850b0b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.753779 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvthz\" (UniqueName: \"kubernetes.io/projected/1cf2a4df-221d-4b0c-8a47-114deb1af60a-kube-api-access-cvthz\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.753938 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1cf2a4df-221d-4b0c-8a47-114deb1af60a-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.754062 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.754279 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.754736 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.759649 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.759719 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.759758 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1cf2a4df-221d-4b0c-8a47-114deb1af60a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.759820 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.759978 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/975c9850-0dc7-4b43-a521-015930850b0b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.760064 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 
00:50:03.760110 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1cf2a4df-221d-4b0c-8a47-114deb1af60a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.760162 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cf2a4df-221d-4b0c-8a47-114deb1af60a-config-data\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.760186 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1cf2a4df-221d-4b0c-8a47-114deb1af60a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.760298 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.760422 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.760468 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/975c9850-0dc7-4b43-a521-015930850b0b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.760550 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/975c9850-0dc7-4b43-a521-015930850b0b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.761028 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.761196 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.761278 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.761918 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1cf2a4df-221d-4b0c-8a47-114deb1af60a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.762495 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1cf2a4df-221d-4b0c-8a47-114deb1af60a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.764442 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cf2a4df-221d-4b0c-8a47-114deb1af60a-config-data\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.765038 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.765086 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r842n\" (UniqueName: \"kubernetes.io/projected/975c9850-0dc7-4b43-a521-015930850b0b-kube-api-access-r842n\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.765111 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/975c9850-0dc7-4b43-a521-015930850b0b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.766062 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1cf2a4df-221d-4b0c-8a47-114deb1af60a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.766395 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1cf2a4df-221d-4b0c-8a47-114deb1af60a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.770709 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1cf2a4df-221d-4b0c-8a47-114deb1af60a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.788177 4759 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvthz\" (UniqueName: \"kubernetes.io/projected/1cf2a4df-221d-4b0c-8a47-114deb1af60a-kube-api-access-cvthz\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.798045 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"1cf2a4df-221d-4b0c-8a47-114deb1af60a\") " pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.867168 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/975c9850-0dc7-4b43-a521-015930850b0b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.867212 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.867270 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.867292 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/975c9850-0dc7-4b43-a521-015930850b0b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.867323 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/975c9850-0dc7-4b43-a521-015930850b0b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.867351 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.867366 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r842n\" (UniqueName: \"kubernetes.io/projected/975c9850-0dc7-4b43-a521-015930850b0b-kube-api-access-r842n\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.867380 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/975c9850-0dc7-4b43-a521-015930850b0b-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.867399 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.867426 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/975c9850-0dc7-4b43-a521-015930850b0b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.867455 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.869043 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/975c9850-0dc7-4b43-a521-015930850b0b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.869135 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.869579 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.870223 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/975c9850-0dc7-4b43-a521-015930850b0b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.870622 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/975c9850-0dc7-4b43-a521-015930850b0b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.871668 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 
00:50:03.876022 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/975c9850-0dc7-4b43-a521-015930850b0b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.877162 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.877577 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/975c9850-0dc7-4b43-a521-015930850b0b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.886546 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/975c9850-0dc7-4b43-a521-015930850b0b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.902039 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r842n\" (UniqueName: \"kubernetes.io/projected/975c9850-0dc7-4b43-a521-015930850b0b-kube-api-access-r842n\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.905444 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"975c9850-0dc7-4b43-a521-015930850b0b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.931291 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 00:50:03 crc kubenswrapper[4759]: I1205 00:50:03.960981 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.171727 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f171e66-8683-4856-8dd3-690bbdd0f6e5" path="/var/lib/kubelet/pods/9f171e66-8683-4856-8dd3-690bbdd0f6e5/volumes" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.172569 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a699edc7-e60f-482d-962d-6c69d625a1c5" path="/var/lib/kubelet/pods/a699edc7-e60f-482d-962d-6c69d625a1c5/volumes" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.173867 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9dfe817-0ae0-4895-9b5d-ad80ae8b102b" path="/var/lib/kubelet/pods/d9dfe817-0ae0-4895-9b5d-ad80ae8b102b/volumes" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.250273 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-gltbn"] Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.251917 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.254072 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.266295 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-gltbn"] Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.298207 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-svc\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.298279 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-config\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.298392 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-nb\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.298616 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw6wj\" (UniqueName: \"kubernetes.io/projected/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-kube-api-access-jw6wj\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.298662 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-swift-storage-0\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.298823 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-sb\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.298936 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-openstack-edpm-ipam\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.405385 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw6wj\" (UniqueName: \"kubernetes.io/projected/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-kube-api-access-jw6wj\") pod 
\"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.405703 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-swift-storage-0\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.405746 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-sb\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.405787 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-openstack-edpm-ipam\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.405828 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-svc\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.405854 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-config\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.405896 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-nb\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.407477 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-nb\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.408034 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-svc\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.408598 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-config\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " 
pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.409097 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-openstack-edpm-ipam\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.409343 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-swift-storage-0\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.409834 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-sb\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.426266 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw6wj\" (UniqueName: \"kubernetes.io/projected/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-kube-api-access-jw6wj\") pod \"dnsmasq-dns-594cb89c79-gltbn\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:05 crc kubenswrapper[4759]: I1205 00:50:05.614546 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:08 crc kubenswrapper[4759]: I1205 00:50:08.163627 4759 scope.go:117] "RemoveContainer" containerID="0a18484c4c14989b6cd1995e3595bcffb8b46d6d995e6356081f751cb7815390" Dec 05 00:50:08 crc kubenswrapper[4759]: I1205 00:50:08.635525 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 00:50:08 crc kubenswrapper[4759]: W1205 00:50:08.842209 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16466e0b_7a83_46aa_b39e_d52ea5c19f86.slice/crio-ce9e40465373db2bd43b35c82ca42e76ce4a834a9827342f1daa370a62a6c401 WatchSource:0}: Error finding container ce9e40465373db2bd43b35c82ca42e76ce4a834a9827342f1daa370a62a6c401: Status 404 returned error can't find the container with id ce9e40465373db2bd43b35c82ca42e76ce4a834a9827342f1daa370a62a6c401 Dec 05 00:50:08 crc kubenswrapper[4759]: E1205 00:50:08.890446 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 05 00:50:08 crc kubenswrapper[4759]: E1205 00:50:08.890501 4759 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 05 00:50:08 crc kubenswrapper[4759]: E1205 00:50:08.890628 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7p8rm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-q2dv8_openstack(f827c4c1-9ca4-455c-9e82-67ee6d95f5fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 00:50:08 crc kubenswrapper[4759]: E1205 00:50:08.892463 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-q2dv8" podUID="f827c4c1-9ca4-455c-9e82-67ee6d95f5fd" Dec 05 00:50:08 crc kubenswrapper[4759]: I1205 00:50:08.895823 4759 scope.go:117] "RemoveContainer" containerID="ca04ade59379a48ea6912bb16fd05a5d344ddd6cdc5f5a737972315b9d17009c" Dec 05 00:50:09 crc kubenswrapper[4759]: I1205 00:50:09.121031 4759 scope.go:117] "RemoveContainer" containerID="c83f3d32474b3c646da6c46d72bb3c5c09d4fcf54de545f37d1c5420f77526c0" Dec 05 00:50:09 crc kubenswrapper[4759]: I1205 00:50:09.273706 4759 scope.go:117] "RemoveContainer" containerID="55e881d961135ec56ffaae608f20f8599d0be85929340ed4e67a4b19e1693f61" Dec 05 00:50:09 crc kubenswrapper[4759]: I1205 00:50:09.369115 4759 scope.go:117] "RemoveContainer" containerID="b511a66d1e004c0f268b8e6d9b0e5bc1a7b577592e9070b3ac0d5b290ddc8e97" Dec 05 00:50:09 crc kubenswrapper[4759]: I1205 00:50:09.382803 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 00:50:09 crc kubenswrapper[4759]: I1205 00:50:09.540627 4759 scope.go:117] "RemoveContainer" containerID="60e223bb18ce845cab8ddf921004958f044d69a60364af0d0f26073750759ea4" Dec 05 00:50:09 crc kubenswrapper[4759]: I1205 00:50:09.547794 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"975c9850-0dc7-4b43-a521-015930850b0b","Type":"ContainerStarted","Data":"0fcf04908246bd00adb4309da125e505a0ff8f17a30c7435d3c04265f6f01230"} Dec 05 00:50:09 crc kubenswrapper[4759]: I1205 00:50:09.549562 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16466e0b-7a83-46aa-b39e-d52ea5c19f86","Type":"ContainerStarted","Data":"ce9e40465373db2bd43b35c82ca42e76ce4a834a9827342f1daa370a62a6c401"} Dec 05 00:50:09 crc kubenswrapper[4759]: E1205 00:50:09.554749 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-q2dv8" podUID="f827c4c1-9ca4-455c-9e82-67ee6d95f5fd" Dec 05 00:50:09 crc kubenswrapper[4759]: W1205 00:50:09.663609 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cf2a4df_221d_4b0c_8a47_114deb1af60a.slice/crio-1483d05e7ef16837b4185bddbb950b106a6cd318b3e766c50869b59d893963b4 WatchSource:0}: Error finding container 1483d05e7ef16837b4185bddbb950b106a6cd318b3e766c50869b59d893963b4: Status 404 returned error can't find the container with id 1483d05e7ef16837b4185bddbb950b106a6cd318b3e766c50869b59d893963b4 Dec 05 00:50:09 crc kubenswrapper[4759]: I1205 00:50:09.665335 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 00:50:09 crc kubenswrapper[4759]: I1205 00:50:09.680334 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-gltbn"] Dec 05 00:50:10 crc kubenswrapper[4759]: I1205 00:50:10.569147 4759 generic.go:334] "Generic (PLEG): container finished" podID="97f43b1a-bc7f-4928-964c-94b1ce1c49f0" containerID="5431d50628bf91b90828b410a088b4fa2b7758e7d43ed839f903d2b0d4d93eab" exitCode=0 Dec 05 00:50:10 crc kubenswrapper[4759]: I1205 00:50:10.569352 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" event={"ID":"97f43b1a-bc7f-4928-964c-94b1ce1c49f0","Type":"ContainerDied","Data":"5431d50628bf91b90828b410a088b4fa2b7758e7d43ed839f903d2b0d4d93eab"} Dec 05 00:50:10 crc kubenswrapper[4759]: I1205 00:50:10.569473 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" event={"ID":"97f43b1a-bc7f-4928-964c-94b1ce1c49f0","Type":"ContainerStarted","Data":"6775d130c6282708f03cd2754f2fdb20c0315e37dd500bcae20c113c493aa8d0"} Dec 05 00:50:10 crc kubenswrapper[4759]: I1205 00:50:10.570488 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1cf2a4df-221d-4b0c-8a47-114deb1af60a","Type":"ContainerStarted","Data":"1483d05e7ef16837b4185bddbb950b106a6cd318b3e766c50869b59d893963b4"} Dec 05 00:50:11 crc kubenswrapper[4759]: I1205 00:50:11.598111 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1cf2a4df-221d-4b0c-8a47-114deb1af60a","Type":"ContainerStarted","Data":"a7334a565975994640a90157711f36a818a19cefb6c0ed333b27afc43c322559"} Dec 05 00:50:11 crc kubenswrapper[4759]: I1205 00:50:11.599911 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"975c9850-0dc7-4b43-a521-015930850b0b","Type":"ContainerStarted","Data":"e559c57fd8958e9eadb3b4991ac94574d213352bdb0ddce9e687fd666b9aacbf"} Dec 05 00:50:13 crc kubenswrapper[4759]: I1205 
00:50:13.623846 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" event={"ID":"97f43b1a-bc7f-4928-964c-94b1ce1c49f0","Type":"ContainerStarted","Data":"0fcc0646f7e0c1ef3777d3c31f7765f5c2c3b034473e379351e3d34a39065431"} Dec 05 00:50:13 crc kubenswrapper[4759]: I1205 00:50:13.624323 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:13 crc kubenswrapper[4759]: I1205 00:50:13.624989 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16466e0b-7a83-46aa-b39e-d52ea5c19f86","Type":"ContainerStarted","Data":"05370620c000aa88c690b6ef35efbe18b8c2aa65f0860925c75e73bd24eb9936"} Dec 05 00:50:13 crc kubenswrapper[4759]: I1205 00:50:13.671492 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" podStartSLOduration=8.67147484 podStartE2EDuration="8.67147484s" podCreationTimestamp="2025-12-05 00:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:50:13.666635901 +0000 UTC m=+1632.882296851" watchObservedRunningTime="2025-12-05 00:50:13.67147484 +0000 UTC m=+1632.887135790" Dec 05 00:50:14 crc kubenswrapper[4759]: I1205 00:50:14.157214 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:50:14 crc kubenswrapper[4759]: E1205 00:50:14.157494 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:50:16 crc kubenswrapper[4759]: I1205 00:50:16.657480 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16466e0b-7a83-46aa-b39e-d52ea5c19f86","Type":"ContainerStarted","Data":"bc664683605fb9335216ce108d0e079d867b8ade4ade94f0f1b61df8190d35c9"} Dec 05 00:50:17 crc kubenswrapper[4759]: I1205 00:50:17.678142 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16466e0b-7a83-46aa-b39e-d52ea5c19f86","Type":"ContainerStarted","Data":"1cb39de1c193df0cbe47099a1570e12bda587e09381cf57f34821395efedb940"} Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.616519 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.702054 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-x7rsx"] Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.702352 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" podUID="c361255a-5c87-4fcb-81a0-e5160580cc33" containerName="dnsmasq-dns" containerID="cri-o://3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b" gracePeriod=10 Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.749242 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"16466e0b-7a83-46aa-b39e-d52ea5c19f86","Type":"ContainerStarted","Data":"7584ca3c88abe9328cc0aa6892b78b37a707fa617cfdbb0dd6c7dfa72510591e"} Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.750169 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.775644 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=13.398328958 podStartE2EDuration="24.775622687s" podCreationTimestamp="2025-12-05 00:49:56 +0000 UTC" firstStartedPulling="2025-12-05 00:50:08.854283031 +0000 UTC m=+1628.069943991" lastFinishedPulling="2025-12-05 00:50:20.23157677 +0000 UTC m=+1639.447237720" observedRunningTime="2025-12-05 00:50:20.77287835 +0000 UTC m=+1639.988539300" watchObservedRunningTime="2025-12-05 00:50:20.775622687 +0000 UTC m=+1639.991283637" Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.908742 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dd9576ff-cgflc"] Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.910993 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.930554 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dd9576ff-cgflc"] Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.991462 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-swift-storage-0\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.991515 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.991533 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-nb\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.991556 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-sb\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.991656 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8fw4\" (UniqueName: \"kubernetes.io/projected/9542f289-2a5b-4593-8cf5-d43690c6440e-kube-api-access-c8fw4\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 
00:50:20.991676 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-config\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:20 crc kubenswrapper[4759]: I1205 00:50:20.991692 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-svc\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.093873 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-swift-storage-0\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.094278 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.094298 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-nb\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.094332 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-sb\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.094418 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8fw4\" (UniqueName: \"kubernetes.io/projected/9542f289-2a5b-4593-8cf5-d43690c6440e-kube-api-access-c8fw4\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.094443 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-config\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.094461 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-svc\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.095155 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.095260 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-svc\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.095716 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-swift-storage-0\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.096837 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-config\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.096938 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-nb\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.096942 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-sb\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.113586 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8fw4\" (UniqueName: \"kubernetes.io/projected/9542f289-2a5b-4593-8cf5-d43690c6440e-kube-api-access-c8fw4\") pod \"dnsmasq-dns-6dd9576ff-cgflc\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.236656 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.245713 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.402684 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t82x8\" (UniqueName: \"kubernetes.io/projected/c361255a-5c87-4fcb-81a0-e5160580cc33-kube-api-access-t82x8\") pod \"c361255a-5c87-4fcb-81a0-e5160580cc33\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.402959 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-swift-storage-0\") pod \"c361255a-5c87-4fcb-81a0-e5160580cc33\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.403008 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-nb\") pod \"c361255a-5c87-4fcb-81a0-e5160580cc33\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.403086 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-svc\") pod \"c361255a-5c87-4fcb-81a0-e5160580cc33\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.403117 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-sb\") pod \"c361255a-5c87-4fcb-81a0-e5160580cc33\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.403206 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-config\") pod \"c361255a-5c87-4fcb-81a0-e5160580cc33\" (UID: \"c361255a-5c87-4fcb-81a0-e5160580cc33\") " Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.423323 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c361255a-5c87-4fcb-81a0-e5160580cc33-kube-api-access-t82x8" (OuterVolumeSpecName: "kube-api-access-t82x8") pod "c361255a-5c87-4fcb-81a0-e5160580cc33" (UID: "c361255a-5c87-4fcb-81a0-e5160580cc33"). InnerVolumeSpecName "kube-api-access-t82x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.483889 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-config" (OuterVolumeSpecName: "config") pod "c361255a-5c87-4fcb-81a0-e5160580cc33" (UID: "c361255a-5c87-4fcb-81a0-e5160580cc33"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.506833 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t82x8\" (UniqueName: \"kubernetes.io/projected/c361255a-5c87-4fcb-81a0-e5160580cc33-kube-api-access-t82x8\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.506854 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.520797 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c361255a-5c87-4fcb-81a0-e5160580cc33" (UID: "c361255a-5c87-4fcb-81a0-e5160580cc33"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.560174 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c361255a-5c87-4fcb-81a0-e5160580cc33" (UID: "c361255a-5c87-4fcb-81a0-e5160580cc33"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.579105 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c361255a-5c87-4fcb-81a0-e5160580cc33" (UID: "c361255a-5c87-4fcb-81a0-e5160580cc33"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.608633 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.608663 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.608673 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.624815 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c361255a-5c87-4fcb-81a0-e5160580cc33" (UID: "c361255a-5c87-4fcb-81a0-e5160580cc33"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.710043 4759 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c361255a-5c87-4fcb-81a0-e5160580cc33-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.764353 4759 generic.go:334] "Generic (PLEG): container finished" podID="c361255a-5c87-4fcb-81a0-e5160580cc33" containerID="3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b" exitCode=0 Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.765659 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.765953 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" event={"ID":"c361255a-5c87-4fcb-81a0-e5160580cc33","Type":"ContainerDied","Data":"3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b"} Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.766015 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-x7rsx" event={"ID":"c361255a-5c87-4fcb-81a0-e5160580cc33","Type":"ContainerDied","Data":"2d359aead8b6d187e612d5ff30b7f687bd4a1f34e82cc85da4c4bc7a467ca8d6"} Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.766036 4759 scope.go:117] "RemoveContainer" containerID="3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.795147 4759 scope.go:117] "RemoveContainer" containerID="a33482ebef3cd74043accf1d72bfd7580372558ceb256ecb52c51bdce9e7a016" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.817232 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-x7rsx"] Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.832253 4759 scope.go:117] "RemoveContainer" containerID="3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b" Dec 05 00:50:21 crc kubenswrapper[4759]: E1205 00:50:21.834512 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b\": container with ID starting with 3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b not found: ID does not exist" containerID="3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.834554 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b"} err="failed to get container status \"3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b\": rpc error: code = NotFound desc = could not find container \"3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b\": container with ID starting with 3d8f12f7d0db48e4f1068e3bf95565b26b0ca83f3ee41ed75d9dccb9c8140d9b not found: ID does not exist" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.837087 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-x7rsx"] Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.834592 4759 scope.go:117] "RemoveContainer" containerID="a33482ebef3cd74043accf1d72bfd7580372558ceb256ecb52c51bdce9e7a016" Dec 05 00:50:21 crc kubenswrapper[4759]: E1205 
00:50:21.838410 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a33482ebef3cd74043accf1d72bfd7580372558ceb256ecb52c51bdce9e7a016\": container with ID starting with a33482ebef3cd74043accf1d72bfd7580372558ceb256ecb52c51bdce9e7a016 not found: ID does not exist" containerID="a33482ebef3cd74043accf1d72bfd7580372558ceb256ecb52c51bdce9e7a016" Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.838434 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a33482ebef3cd74043accf1d72bfd7580372558ceb256ecb52c51bdce9e7a016"} err="failed to get container status \"a33482ebef3cd74043accf1d72bfd7580372558ceb256ecb52c51bdce9e7a016\": rpc error: code = NotFound desc = could not find container \"a33482ebef3cd74043accf1d72bfd7580372558ceb256ecb52c51bdce9e7a016\": container with ID starting with a33482ebef3cd74043accf1d72bfd7580372558ceb256ecb52c51bdce9e7a016 not found: ID does not exist" Dec 05 00:50:21 crc kubenswrapper[4759]: W1205 00:50:21.845465 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9542f289_2a5b_4593_8cf5_d43690c6440e.slice/crio-763d8c1878c0f4c01c451ce9b002dd29b518ee827bc4274fb1b42df31e1b833d WatchSource:0}: Error finding container 763d8c1878c0f4c01c451ce9b002dd29b518ee827bc4274fb1b42df31e1b833d: Status 404 returned error can't find the container with id 763d8c1878c0f4c01c451ce9b002dd29b518ee827bc4274fb1b42df31e1b833d Dec 05 00:50:21 crc kubenswrapper[4759]: I1205 00:50:21.847160 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dd9576ff-cgflc"] Dec 05 00:50:21 crc kubenswrapper[4759]: E1205 00:50:21.948949 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc361255a_5c87_4fcb_81a0_e5160580cc33.slice\": RecentStats: unable to find data in memory cache]" Dec 05 00:50:22 crc kubenswrapper[4759]: I1205 00:50:22.778870 4759 generic.go:334] "Generic (PLEG): container finished" podID="9542f289-2a5b-4593-8cf5-d43690c6440e" containerID="6d9b7cb37dcdd232a7bcf4c16615757e738bb0dddf7f7faf35be90aeaea8e838" exitCode=0 Dec 05 00:50:22 crc kubenswrapper[4759]: I1205 00:50:22.778979 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" event={"ID":"9542f289-2a5b-4593-8cf5-d43690c6440e","Type":"ContainerDied","Data":"6d9b7cb37dcdd232a7bcf4c16615757e738bb0dddf7f7faf35be90aeaea8e838"} Dec 05 00:50:22 crc kubenswrapper[4759]: I1205 00:50:22.779209 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" event={"ID":"9542f289-2a5b-4593-8cf5-d43690c6440e","Type":"ContainerStarted","Data":"763d8c1878c0f4c01c451ce9b002dd29b518ee827bc4274fb1b42df31e1b833d"} Dec 05 00:50:22 crc kubenswrapper[4759]: I1205 00:50:22.782562 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-q2dv8" event={"ID":"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd","Type":"ContainerStarted","Data":"5e40b7708eb1143bd097a614bd1b2e025aedb1b829aeefe678d239f283ef0b8a"} Dec 05 00:50:22 crc kubenswrapper[4759]: I1205 00:50:22.840493 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-q2dv8" podStartSLOduration=1.945120548 podStartE2EDuration="37.840454001s" podCreationTimestamp="2025-12-05 00:49:45 +0000 UTC" 
firstStartedPulling="2025-12-05 00:49:46.480701657 +0000 UTC m=+1605.696362597" lastFinishedPulling="2025-12-05 00:50:22.37603509 +0000 UTC m=+1641.591696050" observedRunningTime="2025-12-05 00:50:22.831649155 +0000 UTC m=+1642.047310115" watchObservedRunningTime="2025-12-05 00:50:22.840454001 +0000 UTC m=+1642.056114951" Dec 05 00:50:23 crc kubenswrapper[4759]: I1205 00:50:23.169607 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c361255a-5c87-4fcb-81a0-e5160580cc33" path="/var/lib/kubelet/pods/c361255a-5c87-4fcb-81a0-e5160580cc33/volumes" Dec 05 00:50:24 crc kubenswrapper[4759]: I1205 00:50:24.820013 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" event={"ID":"9542f289-2a5b-4593-8cf5-d43690c6440e","Type":"ContainerStarted","Data":"26067cd9ec606d5cee224ef399c9cf32d94ed7ef69e2eee1f341554299e7944f"} Dec 05 00:50:24 crc kubenswrapper[4759]: I1205 00:50:24.820369 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:24 crc kubenswrapper[4759]: I1205 00:50:24.840813 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" podStartSLOduration=4.84079357 podStartE2EDuration="4.84079357s" podCreationTimestamp="2025-12-05 00:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:50:24.837572771 +0000 UTC m=+1644.053233711" watchObservedRunningTime="2025-12-05 00:50:24.84079357 +0000 UTC m=+1644.056454520" Dec 05 00:50:25 crc kubenswrapper[4759]: I1205 00:50:25.837541 4759 generic.go:334] "Generic (PLEG): container finished" podID="f827c4c1-9ca4-455c-9e82-67ee6d95f5fd" containerID="5e40b7708eb1143bd097a614bd1b2e025aedb1b829aeefe678d239f283ef0b8a" exitCode=0 Dec 05 00:50:25 crc kubenswrapper[4759]: I1205 00:50:25.837611 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-q2dv8" event={"ID":"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd","Type":"ContainerDied","Data":"5e40b7708eb1143bd097a614bd1b2e025aedb1b829aeefe678d239f283ef0b8a"} Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.156891 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:50:27 crc kubenswrapper[4759]: E1205 00:50:27.157508 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.364955 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-q2dv8" Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.535969 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p8rm\" (UniqueName: \"kubernetes.io/projected/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-kube-api-access-7p8rm\") pod \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\" (UID: \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.536164 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-config-data\") pod \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\" (UID: \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.536216 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-combined-ca-bundle\") pod \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\" (UID: \"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd\") " Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.544188 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-kube-api-access-7p8rm" (OuterVolumeSpecName: "kube-api-access-7p8rm") pod "f827c4c1-9ca4-455c-9e82-67ee6d95f5fd" (UID: "f827c4c1-9ca4-455c-9e82-67ee6d95f5fd"). InnerVolumeSpecName "kube-api-access-7p8rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.571145 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f827c4c1-9ca4-455c-9e82-67ee6d95f5fd" (UID: "f827c4c1-9ca4-455c-9e82-67ee6d95f5fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.617851 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-config-data" (OuterVolumeSpecName: "config-data") pod "f827c4c1-9ca4-455c-9e82-67ee6d95f5fd" (UID: "f827c4c1-9ca4-455c-9e82-67ee6d95f5fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.638825 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p8rm\" (UniqueName: \"kubernetes.io/projected/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-kube-api-access-7p8rm\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.638872 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.638882 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.876233 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-q2dv8" event={"ID":"f827c4c1-9ca4-455c-9e82-67ee6d95f5fd","Type":"ContainerDied","Data":"60c25533e1859fc94306417b1e465b30b61f7580436684917c956ea2850af615"} Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.876277 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60c25533e1859fc94306417b1e465b30b61f7580436684917c956ea2850af615" Dec 05 00:50:27 crc kubenswrapper[4759]: I1205 00:50:27.876353 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-q2dv8" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.177659 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-85bff75774-kcbvj"] Dec 05 00:50:29 crc kubenswrapper[4759]: E1205 00:50:29.178618 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c361255a-5c87-4fcb-81a0-e5160580cc33" containerName="init" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.178641 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c361255a-5c87-4fcb-81a0-e5160580cc33" containerName="init" Dec 05 00:50:29 crc kubenswrapper[4759]: E1205 00:50:29.178664 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c361255a-5c87-4fcb-81a0-e5160580cc33" containerName="dnsmasq-dns" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.178677 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c361255a-5c87-4fcb-81a0-e5160580cc33" containerName="dnsmasq-dns" Dec 05 00:50:29 crc kubenswrapper[4759]: E1205 00:50:29.178714 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f827c4c1-9ca4-455c-9e82-67ee6d95f5fd" containerName="heat-db-sync" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.178724 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f827c4c1-9ca4-455c-9e82-67ee6d95f5fd" containerName="heat-db-sync" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.179105 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f827c4c1-9ca4-455c-9e82-67ee6d95f5fd" containerName="heat-db-sync" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.179154 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c361255a-5c87-4fcb-81a0-e5160580cc33" containerName="dnsmasq-dns" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.180374 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.196772 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-85bff75774-kcbvj"] Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.272507 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-config-data\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.272584 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm97n\" (UniqueName: \"kubernetes.io/projected/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-kube-api-access-lm97n\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.272751 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-combined-ca-bundle\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.272835 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-config-data-custom\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.305146 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7666bd695c-zdxtn"] Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.306909 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.313913 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5fd8599d54-rjclc"] Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.315857 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.328500 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7666bd695c-zdxtn"] Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.338311 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5fd8599d54-rjclc"] Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.374793 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-combined-ca-bundle\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.374982 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-combined-ca-bundle\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.375011 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-config-data-custom\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.375029 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-config-data\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.375083 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-public-tls-certs\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.375109 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-config-data\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.375135 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-internal-tls-certs\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.375172 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-config-data-custom\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " 
pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.375201 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm97n\" (UniqueName: \"kubernetes.io/projected/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-kube-api-access-lm97n\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.375231 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkbbc\" (UniqueName: \"kubernetes.io/projected/93df44aa-16c3-4374-a75a-440bc03ba2cd-kube-api-access-dkbbc\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.379691 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-combined-ca-bundle\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.379969 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-config-data-custom\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.384607 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-config-data\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.396436 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm97n\" (UniqueName: \"kubernetes.io/projected/75ee8344-adba-4c6d-83a2-52e1e8ce15e7-kube-api-access-lm97n\") pod \"heat-engine-85bff75774-kcbvj\" (UID: \"75ee8344-adba-4c6d-83a2-52e1e8ce15e7\") " pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477267 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-config-data-custom\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477343 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-internal-tls-certs\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477400 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-combined-ca-bundle\") pod \"heat-api-7666bd695c-zdxtn\" (UID: 
\"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477436 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-config-data\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477471 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfp7n\" (UniqueName: \"kubernetes.io/projected/c6dff63f-0a5b-4f52-a373-39a85e77df1e-kube-api-access-cfp7n\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477497 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-public-tls-certs\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477539 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-public-tls-certs\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477570 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-internal-tls-certs\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477602 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-config-data\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477623 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-config-data-custom\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477660 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-combined-ca-bundle\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.477688 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkbbc\" (UniqueName: \"kubernetes.io/projected/93df44aa-16c3-4374-a75a-440bc03ba2cd-kube-api-access-dkbbc\") pod 
\"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.482920 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-public-tls-certs\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.483438 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-combined-ca-bundle\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.483495 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-internal-tls-certs\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.486444 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-config-data-custom\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.486798 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93df44aa-16c3-4374-a75a-440bc03ba2cd-config-data\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.491941 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkbbc\" (UniqueName: \"kubernetes.io/projected/93df44aa-16c3-4374-a75a-440bc03ba2cd-kube-api-access-dkbbc\") pod \"heat-api-7666bd695c-zdxtn\" (UID: \"93df44aa-16c3-4374-a75a-440bc03ba2cd\") " pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.501016 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.579229 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-config-data\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.579529 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-combined-ca-bundle\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.579603 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-config-data-custom\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.579634 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-internal-tls-certs\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.579678 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfp7n\" (UniqueName: \"kubernetes.io/projected/c6dff63f-0a5b-4f52-a373-39a85e77df1e-kube-api-access-cfp7n\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.579701 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-public-tls-certs\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.585449 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-combined-ca-bundle\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.592897 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-public-tls-certs\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.599588 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-internal-tls-certs\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " 
pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.601697 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-config-data\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.606804 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfp7n\" (UniqueName: \"kubernetes.io/projected/c6dff63f-0a5b-4f52-a373-39a85e77df1e-kube-api-access-cfp7n\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.609676 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6dff63f-0a5b-4f52-a373-39a85e77df1e-config-data-custom\") pod \"heat-cfnapi-5fd8599d54-rjclc\" (UID: \"c6dff63f-0a5b-4f52-a373-39a85e77df1e\") " pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.634980 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.646837 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:29 crc kubenswrapper[4759]: W1205 00:50:29.985365 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75ee8344_adba_4c6d_83a2_52e1e8ce15e7.slice/crio-244ad406351e2f3c79e30d2eaa643666a61a820aaedf48b05fd1a420ebe050e4 WatchSource:0}: Error finding container 244ad406351e2f3c79e30d2eaa643666a61a820aaedf48b05fd1a420ebe050e4: Status 404 returned error can't find the container with id 244ad406351e2f3c79e30d2eaa643666a61a820aaedf48b05fd1a420ebe050e4 Dec 05 00:50:29 crc kubenswrapper[4759]: I1205 00:50:29.988889 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-85bff75774-kcbvj"] Dec 05 00:50:30 crc kubenswrapper[4759]: W1205 00:50:30.148969 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93df44aa_16c3_4374_a75a_440bc03ba2cd.slice/crio-c92389bce67d094a3febe62cf655b52931f6cffe45138a72d6fe3fdc61f76890 WatchSource:0}: Error finding container c92389bce67d094a3febe62cf655b52931f6cffe45138a72d6fe3fdc61f76890: Status 404 returned error can't find the container with id c92389bce67d094a3febe62cf655b52931f6cffe45138a72d6fe3fdc61f76890 Dec 05 00:50:30 crc kubenswrapper[4759]: I1205 00:50:30.151284 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 00:50:30 crc kubenswrapper[4759]: I1205 00:50:30.153522 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7666bd695c-zdxtn"] Dec 05 00:50:30 crc kubenswrapper[4759]: W1205 00:50:30.280341 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6dff63f_0a5b_4f52_a373_39a85e77df1e.slice/crio-9f1e4c9c09330d6da5a47e58ae1b35494621b30317f8de2b4ec31827f06e82df WatchSource:0}: Error finding container 9f1e4c9c09330d6da5a47e58ae1b35494621b30317f8de2b4ec31827f06e82df: 
Status 404 returned error can't find the container with id 9f1e4c9c09330d6da5a47e58ae1b35494621b30317f8de2b4ec31827f06e82df Dec 05 00:50:30 crc kubenswrapper[4759]: I1205 00:50:30.280483 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5fd8599d54-rjclc"] Dec 05 00:50:30 crc kubenswrapper[4759]: I1205 00:50:30.920953 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5fd8599d54-rjclc" event={"ID":"c6dff63f-0a5b-4f52-a373-39a85e77df1e","Type":"ContainerStarted","Data":"9f1e4c9c09330d6da5a47e58ae1b35494621b30317f8de2b4ec31827f06e82df"} Dec 05 00:50:30 crc kubenswrapper[4759]: I1205 00:50:30.924924 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7666bd695c-zdxtn" event={"ID":"93df44aa-16c3-4374-a75a-440bc03ba2cd","Type":"ContainerStarted","Data":"c92389bce67d094a3febe62cf655b52931f6cffe45138a72d6fe3fdc61f76890"} Dec 05 00:50:30 crc kubenswrapper[4759]: I1205 00:50:30.927051 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-85bff75774-kcbvj" event={"ID":"75ee8344-adba-4c6d-83a2-52e1e8ce15e7","Type":"ContainerStarted","Data":"dcc8f764d7eda4ca27bcda8f68fdcda954966222dfeaf262bb73414d95c510b7"} Dec 05 00:50:30 crc kubenswrapper[4759]: I1205 00:50:30.927075 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-85bff75774-kcbvj" event={"ID":"75ee8344-adba-4c6d-83a2-52e1e8ce15e7","Type":"ContainerStarted","Data":"244ad406351e2f3c79e30d2eaa643666a61a820aaedf48b05fd1a420ebe050e4"} Dec 05 00:50:30 crc kubenswrapper[4759]: I1205 00:50:30.928649 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:30 crc kubenswrapper[4759]: I1205 00:50:30.956071 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-85bff75774-kcbvj" podStartSLOduration=1.956052298 podStartE2EDuration="1.956052298s" podCreationTimestamp="2025-12-05 00:50:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:50:30.94793016 +0000 UTC m=+1650.163591110" watchObservedRunningTime="2025-12-05 00:50:30.956052298 +0000 UTC m=+1650.171713248" Dec 05 00:50:31 crc kubenswrapper[4759]: I1205 00:50:31.249592 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 00:50:31 crc kubenswrapper[4759]: I1205 00:50:31.345785 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-gltbn"] Dec 05 00:50:31 crc kubenswrapper[4759]: I1205 00:50:31.346087 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" podUID="97f43b1a-bc7f-4928-964c-94b1ce1c49f0" containerName="dnsmasq-dns" containerID="cri-o://0fcc0646f7e0c1ef3777d3c31f7765f5c2c3b034473e379351e3d34a39065431" gracePeriod=10 Dec 05 00:50:31 crc kubenswrapper[4759]: I1205 00:50:31.947689 4759 generic.go:334] "Generic (PLEG): container finished" podID="97f43b1a-bc7f-4928-964c-94b1ce1c49f0" containerID="0fcc0646f7e0c1ef3777d3c31f7765f5c2c3b034473e379351e3d34a39065431" exitCode=0 Dec 05 00:50:31 crc kubenswrapper[4759]: I1205 00:50:31.947772 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" 
event={"ID":"97f43b1a-bc7f-4928-964c-94b1ce1c49f0","Type":"ContainerDied","Data":"0fcc0646f7e0c1ef3777d3c31f7765f5c2c3b034473e379351e3d34a39065431"} Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.237828 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.347873 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-config\") pod \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.347949 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-svc\") pod \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.348029 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-sb\") pod \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.348125 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-openstack-edpm-ipam\") pod \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.348215 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-swift-storage-0\") pod \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.348244 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-nb\") pod \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.348358 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw6wj\" (UniqueName: \"kubernetes.io/projected/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-kube-api-access-jw6wj\") pod \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\" (UID: \"97f43b1a-bc7f-4928-964c-94b1ce1c49f0\") " Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.353128 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-kube-api-access-jw6wj" (OuterVolumeSpecName: "kube-api-access-jw6wj") pod "97f43b1a-bc7f-4928-964c-94b1ce1c49f0" (UID: "97f43b1a-bc7f-4928-964c-94b1ce1c49f0"). InnerVolumeSpecName "kube-api-access-jw6wj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.419515 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "97f43b1a-bc7f-4928-964c-94b1ce1c49f0" (UID: "97f43b1a-bc7f-4928-964c-94b1ce1c49f0"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.428993 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-config" (OuterVolumeSpecName: "config") pod "97f43b1a-bc7f-4928-964c-94b1ce1c49f0" (UID: "97f43b1a-bc7f-4928-964c-94b1ce1c49f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.445141 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "97f43b1a-bc7f-4928-964c-94b1ce1c49f0" (UID: "97f43b1a-bc7f-4928-964c-94b1ce1c49f0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.450801 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.450834 4759 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.450872 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw6wj\" (UniqueName: \"kubernetes.io/projected/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-kube-api-access-jw6wj\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.450886 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-config\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.460116 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "97f43b1a-bc7f-4928-964c-94b1ce1c49f0" (UID: "97f43b1a-bc7f-4928-964c-94b1ce1c49f0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.483884 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "97f43b1a-bc7f-4928-964c-94b1ce1c49f0" (UID: "97f43b1a-bc7f-4928-964c-94b1ce1c49f0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.491755 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "97f43b1a-bc7f-4928-964c-94b1ce1c49f0" (UID: "97f43b1a-bc7f-4928-964c-94b1ce1c49f0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.552619 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.552652 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.552662 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97f43b1a-bc7f-4928-964c-94b1ce1c49f0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.959683 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" event={"ID":"97f43b1a-bc7f-4928-964c-94b1ce1c49f0","Type":"ContainerDied","Data":"6775d130c6282708f03cd2754f2fdb20c0315e37dd500bcae20c113c493aa8d0"} Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.960053 4759 scope.go:117] "RemoveContainer" containerID="0fcc0646f7e0c1ef3777d3c31f7765f5c2c3b034473e379351e3d34a39065431" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.959774 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-gltbn" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.961423 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7666bd695c-zdxtn" event={"ID":"93df44aa-16c3-4374-a75a-440bc03ba2cd","Type":"ContainerStarted","Data":"a883f518188f8449f3d3abdcd5c8d30edeba60000128f789813efa65d920c9f6"} Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.961565 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.964838 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5fd8599d54-rjclc" event={"ID":"c6dff63f-0a5b-4f52-a373-39a85e77df1e","Type":"ContainerStarted","Data":"e1d37e296fe90287b5da40c761bc04802b1ef93050a2b472d12d35a8731115c8"} Dec 05 00:50:32 crc kubenswrapper[4759]: I1205 00:50:32.986028 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5fd8599d54-rjclc" podStartSLOduration=2.3416857970000002 podStartE2EDuration="3.986006031s" podCreationTimestamp="2025-12-05 00:50:29 +0000 UTC" firstStartedPulling="2025-12-05 00:50:30.285109474 +0000 UTC m=+1649.500770424" lastFinishedPulling="2025-12-05 00:50:31.929429708 +0000 UTC m=+1651.145090658" observedRunningTime="2025-12-05 00:50:32.98394069 +0000 UTC m=+1652.199601640" watchObservedRunningTime="2025-12-05 00:50:32.986006031 +0000 UTC m=+1652.201666981" Dec 05 00:50:33 crc kubenswrapper[4759]: I1205 00:50:33.042733 4759 scope.go:117] "RemoveContainer" containerID="5431d50628bf91b90828b410a088b4fa2b7758e7d43ed839f903d2b0d4d93eab" Dec 05 00:50:33 crc kubenswrapper[4759]: I1205 00:50:33.051968 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7666bd695c-zdxtn" podStartSLOduration=2.282790408 podStartE2EDuration="4.051950551s" podCreationTimestamp="2025-12-05 00:50:29 +0000 UTC" firstStartedPulling="2025-12-05 00:50:30.15107348 +0000 UTC m=+1649.366734420" lastFinishedPulling="2025-12-05 00:50:31.920233613 +0000 UTC m=+1651.135894563" observedRunningTime="2025-12-05 00:50:33.049538312 +0000 UTC m=+1652.265199262" watchObservedRunningTime="2025-12-05 00:50:33.051950551 +0000 UTC m=+1652.267611501" Dec 05 00:50:33 crc kubenswrapper[4759]: I1205 00:50:33.078860 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-gltbn"] Dec 05 00:50:33 crc kubenswrapper[4759]: I1205 00:50:33.089818 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-gltbn"] Dec 05 00:50:33 crc kubenswrapper[4759]: I1205 00:50:33.166074 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97f43b1a-bc7f-4928-964c-94b1ce1c49f0" path="/var/lib/kubelet/pods/97f43b1a-bc7f-4928-964c-94b1ce1c49f0/volumes" Dec 05 00:50:33 crc kubenswrapper[4759]: I1205 00:50:33.983410 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.489437 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq"] Dec 05 00:50:39 crc kubenswrapper[4759]: E1205 00:50:39.491633 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f43b1a-bc7f-4928-964c-94b1ce1c49f0" containerName="dnsmasq-dns" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.491738 4759 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="97f43b1a-bc7f-4928-964c-94b1ce1c49f0" containerName="dnsmasq-dns" Dec 05 00:50:39 crc kubenswrapper[4759]: E1205 00:50:39.491847 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f43b1a-bc7f-4928-964c-94b1ce1c49f0" containerName="init" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.491920 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f43b1a-bc7f-4928-964c-94b1ce1c49f0" containerName="init" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.492278 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f43b1a-bc7f-4928-964c-94b1ce1c49f0" containerName="dnsmasq-dns" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.493213 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.497345 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.498495 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.498880 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.499049 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.510034 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq"] Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.601825 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.601952 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.602038 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phbm4\" (UniqueName: \"kubernetes.io/projected/4ec53225-5ccb-4be7-af07-c86a1931fea9-kube-api-access-phbm4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.602128 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.704010 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.704246 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.704299 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.704363 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phbm4\" (UniqueName: \"kubernetes.io/projected/4ec53225-5ccb-4be7-af07-c86a1931fea9-kube-api-access-phbm4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.711089 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.712462 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.721199 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.751853 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phbm4\" (UniqueName: \"kubernetes.io/projected/4ec53225-5ccb-4be7-af07-c86a1931fea9-kube-api-access-phbm4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" 
Dec 05 00:50:39 crc kubenswrapper[4759]: I1205 00:50:39.820627 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:50:40 crc kubenswrapper[4759]: I1205 00:50:40.554294 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq"] Dec 05 00:50:41 crc kubenswrapper[4759]: I1205 00:50:41.046963 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7666bd695c-zdxtn" Dec 05 00:50:41 crc kubenswrapper[4759]: I1205 00:50:41.119549 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5fd8599d54-rjclc" Dec 05 00:50:41 crc kubenswrapper[4759]: I1205 00:50:41.139359 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f8fdfbc8b-gjmcr"] Dec 05 00:50:41 crc kubenswrapper[4759]: I1205 00:50:41.139651 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" podUID="466a3f6b-d457-4d8f-9aa1-33332ebbb5da" containerName="heat-api" containerID="cri-o://cd3a3dfd7fc59e402cad9adc2d3b1b4022e617b64dedf9e6dfe8c29e3cfdeb8f" gracePeriod=60 Dec 05 00:50:41 crc kubenswrapper[4759]: I1205 00:50:41.163269 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:50:41 crc kubenswrapper[4759]: E1205 00:50:41.163643 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:50:41 crc kubenswrapper[4759]: I1205 00:50:41.254560 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-dc865dc89-pdskl"] Dec 05 00:50:41 crc kubenswrapper[4759]: I1205 00:50:41.254793 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-dc865dc89-pdskl" podUID="ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" containerName="heat-cfnapi" containerID="cri-o://5edca824b8e8174871bb63690d372a2cbd59799f6282f923bdd89815c9b81820" gracePeriod=60 Dec 05 00:50:41 crc kubenswrapper[4759]: I1205 00:50:41.259796 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" event={"ID":"4ec53225-5ccb-4be7-af07-c86a1931fea9","Type":"ContainerStarted","Data":"a82163d7c02fa9abd8903b17ab05d0fca8225757f3cb22c9fbf25a7531c19131"} Dec 05 00:50:44 crc kubenswrapper[4759]: I1205 00:50:44.294850 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" podUID="466a3f6b-d457-4d8f-9aa1-33332ebbb5da" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.200:8004/healthcheck\": read tcp 10.217.0.2:50626->10.217.0.200:8004: read: connection reset by peer" Dec 05 00:50:44 crc kubenswrapper[4759]: I1205 00:50:44.343695 4759 generic.go:334] "Generic (PLEG): container finished" podID="975c9850-0dc7-4b43-a521-015930850b0b" containerID="e559c57fd8958e9eadb3b4991ac94574d213352bdb0ddce9e687fd666b9aacbf" exitCode=0 Dec 05 00:50:44 crc kubenswrapper[4759]: I1205 00:50:44.343886 4759 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"975c9850-0dc7-4b43-a521-015930850b0b","Type":"ContainerDied","Data":"e559c57fd8958e9eadb3b4991ac94574d213352bdb0ddce9e687fd666b9aacbf"} Dec 05 00:50:44 crc kubenswrapper[4759]: I1205 00:50:44.350227 4759 generic.go:334] "Generic (PLEG): container finished" podID="1cf2a4df-221d-4b0c-8a47-114deb1af60a" containerID="a7334a565975994640a90157711f36a818a19cefb6c0ed333b27afc43c322559" exitCode=0 Dec 05 00:50:44 crc kubenswrapper[4759]: I1205 00:50:44.350261 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1cf2a4df-221d-4b0c-8a47-114deb1af60a","Type":"ContainerDied","Data":"a7334a565975994640a90157711f36a818a19cefb6c0ed333b27afc43c322559"} Dec 05 00:50:44 crc kubenswrapper[4759]: I1205 00:50:44.406464 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-dc865dc89-pdskl" podUID="ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.201:8000/healthcheck\": read tcp 10.217.0.2:39034->10.217.0.201:8000: read: connection reset by peer" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.359725 4759 generic.go:334] "Generic (PLEG): container finished" podID="ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" containerID="5edca824b8e8174871bb63690d372a2cbd59799f6282f923bdd89815c9b81820" exitCode=0 Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.359833 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-dc865dc89-pdskl" event={"ID":"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e","Type":"ContainerDied","Data":"5edca824b8e8174871bb63690d372a2cbd59799f6282f923bdd89815c9b81820"} Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.362691 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1cf2a4df-221d-4b0c-8a47-114deb1af60a","Type":"ContainerStarted","Data":"53dd926d66e894dd81be39915ed9c48c8582d7ab2db7a7df0de63593000fb7ff"} Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.362908 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.386571 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"975c9850-0dc7-4b43-a521-015930850b0b","Type":"ContainerStarted","Data":"468a9906adb1483d67795315743cc353121ba5ef6948f13bb8206f046f2441d9"} Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.387523 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.392200 4759 generic.go:334] "Generic (PLEG): container finished" podID="466a3f6b-d457-4d8f-9aa1-33332ebbb5da" containerID="cd3a3dfd7fc59e402cad9adc2d3b1b4022e617b64dedf9e6dfe8c29e3cfdeb8f" exitCode=0 Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.392231 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" event={"ID":"466a3f6b-d457-4d8f-9aa1-33332ebbb5da","Type":"ContainerDied","Data":"cd3a3dfd7fc59e402cad9adc2d3b1b4022e617b64dedf9e6dfe8c29e3cfdeb8f"} Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.432355 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.432335386 podStartE2EDuration="42.432335386s" podCreationTimestamp="2025-12-05 00:50:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:50:45.418952579 +0000 UTC m=+1664.634613529" watchObservedRunningTime="2025-12-05 00:50:45.432335386 +0000 UTC m=+1664.647996336" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.468899 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.468877509 podStartE2EDuration="42.468877509s" podCreationTimestamp="2025-12-05 00:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 00:50:45.44967247 +0000 UTC m=+1664.665333420" watchObservedRunningTime="2025-12-05 00:50:45.468877509 +0000 UTC m=+1664.684538449" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.493779 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.499070 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.562990 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data\") pod \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.563059 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data\") pod \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.563178 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-internal-tls-certs\") pod \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.563206 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwcch\" (UniqueName: \"kubernetes.io/projected/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-kube-api-access-fwcch\") pod \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.563263 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-combined-ca-bundle\") pod \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.563770 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7rrm\" (UniqueName: \"kubernetes.io/projected/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-kube-api-access-x7rrm\") pod \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.563812 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data-custom\") pod \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.563841 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-public-tls-certs\") pod \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\" (UID: \"466a3f6b-d457-4d8f-9aa1-33332ebbb5da\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.563904 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-public-tls-certs\") pod \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.563969 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-internal-tls-certs\") pod \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.564021 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data-custom\") pod \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.564064 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-combined-ca-bundle\") pod \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\" (UID: \"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e\") " Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.579270 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-kube-api-access-x7rrm" (OuterVolumeSpecName: "kube-api-access-x7rrm") pod "466a3f6b-d457-4d8f-9aa1-33332ebbb5da" (UID: "466a3f6b-d457-4d8f-9aa1-33332ebbb5da"). InnerVolumeSpecName "kube-api-access-x7rrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.588863 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-kube-api-access-fwcch" (OuterVolumeSpecName: "kube-api-access-fwcch") pod "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" (UID: "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e"). InnerVolumeSpecName "kube-api-access-fwcch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.591419 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "466a3f6b-d457-4d8f-9aa1-33332ebbb5da" (UID: "466a3f6b-d457-4d8f-9aa1-33332ebbb5da"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.606549 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" (UID: "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.667062 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwcch\" (UniqueName: \"kubernetes.io/projected/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-kube-api-access-fwcch\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.667093 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7rrm\" (UniqueName: \"kubernetes.io/projected/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-kube-api-access-x7rrm\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.667104 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.667113 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.687664 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data" (OuterVolumeSpecName: "config-data") pod "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" (UID: "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.694450 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "466a3f6b-d457-4d8f-9aa1-33332ebbb5da" (UID: "466a3f6b-d457-4d8f-9aa1-33332ebbb5da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.702409 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" (UID: "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.743416 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "466a3f6b-d457-4d8f-9aa1-33332ebbb5da" (UID: "466a3f6b-d457-4d8f-9aa1-33332ebbb5da"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.744607 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" (UID: "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.768836 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.768864 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.768874 4759 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.768882 4759 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.768892 4759 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.802190 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" (UID: "ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.807707 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "466a3f6b-d457-4d8f-9aa1-33332ebbb5da" (UID: "466a3f6b-d457-4d8f-9aa1-33332ebbb5da"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.811404 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data" (OuterVolumeSpecName: "config-data") pod "466a3f6b-d457-4d8f-9aa1-33332ebbb5da" (UID: "466a3f6b-d457-4d8f-9aa1-33332ebbb5da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.898987 4759 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.899015 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:45 crc kubenswrapper[4759]: I1205 00:50:45.899026 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466a3f6b-d457-4d8f-9aa1-33332ebbb5da-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:50:46 crc kubenswrapper[4759]: I1205 00:50:46.414291 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-dc865dc89-pdskl" Dec 05 00:50:46 crc kubenswrapper[4759]: I1205 00:50:46.415246 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-dc865dc89-pdskl" event={"ID":"ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e","Type":"ContainerDied","Data":"71f0dd746e11ea151ad232a364d7d88dd90f9d6f58f6d389f862ed81e81d6d2b"} Dec 05 00:50:46 crc kubenswrapper[4759]: I1205 00:50:46.415328 4759 scope.go:117] "RemoveContainer" containerID="5edca824b8e8174871bb63690d372a2cbd59799f6282f923bdd89815c9b81820" Dec 05 00:50:46 crc kubenswrapper[4759]: I1205 00:50:46.427127 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" Dec 05 00:50:46 crc kubenswrapper[4759]: I1205 00:50:46.428388 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f8fdfbc8b-gjmcr" event={"ID":"466a3f6b-d457-4d8f-9aa1-33332ebbb5da","Type":"ContainerDied","Data":"5156c8612a077d2cea82817c1842d32b3f5ce993672ffb0c1b40e9e2b579e0aa"} Dec 05 00:50:46 crc kubenswrapper[4759]: I1205 00:50:46.497363 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-dc865dc89-pdskl"] Dec 05 00:50:46 crc kubenswrapper[4759]: I1205 00:50:46.527360 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-dc865dc89-pdskl"] Dec 05 00:50:46 crc kubenswrapper[4759]: I1205 00:50:46.639355 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f8fdfbc8b-gjmcr"] Dec 05 00:50:46 crc kubenswrapper[4759]: I1205 00:50:46.649405 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5f8fdfbc8b-gjmcr"] Dec 05 00:50:47 crc kubenswrapper[4759]: I1205 00:50:47.198799 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="466a3f6b-d457-4d8f-9aa1-33332ebbb5da" path="/var/lib/kubelet/pods/466a3f6b-d457-4d8f-9aa1-33332ebbb5da/volumes" Dec 05 00:50:47 crc kubenswrapper[4759]: I1205 00:50:47.199369 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" path="/var/lib/kubelet/pods/ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e/volumes" Dec 05 00:50:49 crc kubenswrapper[4759]: I1205 00:50:49.538145 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-85bff75774-kcbvj" Dec 05 00:50:49 crc kubenswrapper[4759]: I1205 00:50:49.606403 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-59cf789667-jg8vv"] Dec 05 00:50:49 crc 
kubenswrapper[4759]: I1205 00:50:49.606717 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-59cf789667-jg8vv" podUID="8b2799ca-c468-46f0-8e2a-689e6d6bf81b" containerName="heat-engine" containerID="cri-o://f7d711493ef317366d30ce83014e5ca680ee182004e79d33a13bd627992b29da" gracePeriod=60 Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.156116 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:50:55 crc kubenswrapper[4759]: E1205 00:50:55.157044 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.533143 4759 scope.go:117] "RemoveContainer" containerID="cd3a3dfd7fc59e402cad9adc2d3b1b4022e617b64dedf9e6dfe8c29e3cfdeb8f" Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.782004 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-5f58p"] Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.792758 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-5f58p"] Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.846872 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-jf2gl"] Dec 05 00:50:55 crc kubenswrapper[4759]: E1205 00:50:55.847744 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466a3f6b-d457-4d8f-9aa1-33332ebbb5da" containerName="heat-api" Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.847766 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="466a3f6b-d457-4d8f-9aa1-33332ebbb5da" containerName="heat-api" Dec 05 00:50:55 crc kubenswrapper[4759]: E1205 00:50:55.847789 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" containerName="heat-cfnapi" Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.847796 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" containerName="heat-cfnapi" Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.847989 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba777ca3-1fb4-4051-b0b6-ddc1cc5bcd5e" containerName="heat-cfnapi" Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.848009 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="466a3f6b-d457-4d8f-9aa1-33332ebbb5da" containerName="heat-api" Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.848738 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.853549 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 00:50:55 crc kubenswrapper[4759]: I1205 00:50:55.862777 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-jf2gl"] Dec 05 00:50:56 crc kubenswrapper[4759]: E1205 00:50:56.022006 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7d711493ef317366d30ce83014e5ca680ee182004e79d33a13bd627992b29da" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 00:50:56 crc kubenswrapper[4759]: E1205 00:50:56.023712 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7d711493ef317366d30ce83014e5ca680ee182004e79d33a13bd627992b29da" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 00:50:56 crc kubenswrapper[4759]: E1205 00:50:56.030212 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7d711493ef317366d30ce83014e5ca680ee182004e79d33a13bd627992b29da" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 05 00:50:56 crc kubenswrapper[4759]: E1205 00:50:56.030274 4759 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-59cf789667-jg8vv" podUID="8b2799ca-c468-46f0-8e2a-689e6d6bf81b" containerName="heat-engine" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.042586 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-scripts\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.042645 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-config-data\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.042691 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwr2s\" (UniqueName: \"kubernetes.io/projected/d7c9a7af-b42a-4d69-93e1-78040788fe1b-kube-api-access-mwr2s\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.042995 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-combined-ca-bundle\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.144587 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-combined-ca-bundle\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.144743 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-scripts\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.144768 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-config-data\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.144815 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwr2s\" (UniqueName: \"kubernetes.io/projected/d7c9a7af-b42a-4d69-93e1-78040788fe1b-kube-api-access-mwr2s\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.152885 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-scripts\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.152977 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-combined-ca-bundle\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.154539 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-config-data\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.180491 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwr2s\" (UniqueName: \"kubernetes.io/projected/d7c9a7af-b42a-4d69-93e1-78040788fe1b-kube-api-access-mwr2s\") pod \"aodh-db-sync-jf2gl\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.474402 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.569504 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" event={"ID":"4ec53225-5ccb-4be7-af07-c86a1931fea9","Type":"ContainerStarted","Data":"7752af2f399c656c6fece04afcefc5133f386037a09abd5c35d132bbd6dd43d3"} Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.626888 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" podStartSLOduration=2.549857624 podStartE2EDuration="17.626870193s" podCreationTimestamp="2025-12-05 00:50:39 +0000 UTC" firstStartedPulling="2025-12-05 00:50:40.55584794 +0000 UTC m=+1659.771508890" lastFinishedPulling="2025-12-05 00:50:55.632860509 +0000 UTC m=+1674.848521459" observedRunningTime="2025-12-05 00:50:56.626057054 +0000 UTC m=+1675.841718004" watchObservedRunningTime="2025-12-05 00:50:56.626870193 +0000 UTC m=+1675.842531143" Dec 05 00:50:56 crc kubenswrapper[4759]: I1205 00:50:56.794888 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 00:50:57 crc kubenswrapper[4759]: I1205 00:50:57.097942 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-jf2gl"] Dec 05 00:50:57 crc kubenswrapper[4759]: I1205 00:50:57.172187 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf28a30c-49fe-4e6f-8684-eca499f44133" path="/var/lib/kubelet/pods/bf28a30c-49fe-4e6f-8684-eca499f44133/volumes" Dec 05 00:50:57 crc kubenswrapper[4759]: I1205 00:50:57.583173 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jf2gl" event={"ID":"d7c9a7af-b42a-4d69-93e1-78040788fe1b","Type":"ContainerStarted","Data":"86622128009f5159e047f7fa4492ee37348b9dd8f1a5a7d3a641568e207e2a10"} Dec 05 00:51:01 crc kubenswrapper[4759]: I1205 00:51:01.869485 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 00:51:02 crc kubenswrapper[4759]: I1205 00:51:02.641076 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jf2gl" event={"ID":"d7c9a7af-b42a-4d69-93e1-78040788fe1b","Type":"ContainerStarted","Data":"da348c22afd5b86083ff116e053f1c904b4c3ec815b586f7d49604d77593b075"} Dec 05 00:51:02 crc kubenswrapper[4759]: I1205 00:51:02.678055 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-jf2gl" podStartSLOduration=2.9279506360000003 podStartE2EDuration="7.678022196s" podCreationTimestamp="2025-12-05 00:50:55 +0000 UTC" firstStartedPulling="2025-12-05 00:50:57.115838224 +0000 UTC m=+1676.331499174" lastFinishedPulling="2025-12-05 00:51:01.865909784 +0000 UTC m=+1681.081570734" observedRunningTime="2025-12-05 00:51:02.659708459 +0000 UTC m=+1681.875369449" watchObservedRunningTime="2025-12-05 00:51:02.678022196 +0000 UTC m=+1681.893683186" Dec 05 00:51:03 crc kubenswrapper[4759]: I1205 00:51:03.933448 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 00:51:03 crc kubenswrapper[4759]: I1205 00:51:03.964446 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 00:51:04 crc kubenswrapper[4759]: I1205 00:51:04.667261 4759 generic.go:334] "Generic (PLEG): container finished" podID="8b2799ca-c468-46f0-8e2a-689e6d6bf81b" 
containerID="f7d711493ef317366d30ce83014e5ca680ee182004e79d33a13bd627992b29da" exitCode=0 Dec 05 00:51:04 crc kubenswrapper[4759]: I1205 00:51:04.667343 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-59cf789667-jg8vv" event={"ID":"8b2799ca-c468-46f0-8e2a-689e6d6bf81b","Type":"ContainerDied","Data":"f7d711493ef317366d30ce83014e5ca680ee182004e79d33a13bd627992b29da"} Dec 05 00:51:04 crc kubenswrapper[4759]: I1205 00:51:04.938799 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.046330 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data\") pod \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.046367 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-combined-ca-bundle\") pod \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.046506 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4mtm\" (UniqueName: \"kubernetes.io/projected/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-kube-api-access-x4mtm\") pod \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.046616 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data-custom\") pod \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\" (UID: \"8b2799ca-c468-46f0-8e2a-689e6d6bf81b\") " Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.051857 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-kube-api-access-x4mtm" (OuterVolumeSpecName: "kube-api-access-x4mtm") pod "8b2799ca-c468-46f0-8e2a-689e6d6bf81b" (UID: "8b2799ca-c468-46f0-8e2a-689e6d6bf81b"). InnerVolumeSpecName "kube-api-access-x4mtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.052150 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8b2799ca-c468-46f0-8e2a-689e6d6bf81b" (UID: "8b2799ca-c468-46f0-8e2a-689e6d6bf81b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.084216 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b2799ca-c468-46f0-8e2a-689e6d6bf81b" (UID: "8b2799ca-c468-46f0-8e2a-689e6d6bf81b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.122049 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data" (OuterVolumeSpecName: "config-data") pod "8b2799ca-c468-46f0-8e2a-689e6d6bf81b" (UID: "8b2799ca-c468-46f0-8e2a-689e6d6bf81b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.149415 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4mtm\" (UniqueName: \"kubernetes.io/projected/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-kube-api-access-x4mtm\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.149449 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.149458 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.149469 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2799ca-c468-46f0-8e2a-689e6d6bf81b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.684480 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-59cf789667-jg8vv" event={"ID":"8b2799ca-c468-46f0-8e2a-689e6d6bf81b","Type":"ContainerDied","Data":"bcc525d0720f203e0370dc208caf2b56cc1aaf6fdf0a0087b0cc8b623b25561b"} Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.684844 4759 scope.go:117] "RemoveContainer" containerID="f7d711493ef317366d30ce83014e5ca680ee182004e79d33a13bd627992b29da" Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.684486 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-59cf789667-jg8vv" Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.694775 4759 generic.go:334] "Generic (PLEG): container finished" podID="d7c9a7af-b42a-4d69-93e1-78040788fe1b" containerID="da348c22afd5b86083ff116e053f1c904b4c3ec815b586f7d49604d77593b075" exitCode=0 Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.694819 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jf2gl" event={"ID":"d7c9a7af-b42a-4d69-93e1-78040788fe1b","Type":"ContainerDied","Data":"da348c22afd5b86083ff116e053f1c904b4c3ec815b586f7d49604d77593b075"} Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.711553 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-59cf789667-jg8vv"] Dec 05 00:51:05 crc kubenswrapper[4759]: I1205 00:51:05.721066 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-59cf789667-jg8vv"] Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.150902 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.179733 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b2799ca-c468-46f0-8e2a-689e6d6bf81b" path="/var/lib/kubelet/pods/8b2799ca-c468-46f0-8e2a-689e6d6bf81b/volumes" Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.298628 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-combined-ca-bundle\") pod \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.298806 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwr2s\" (UniqueName: \"kubernetes.io/projected/d7c9a7af-b42a-4d69-93e1-78040788fe1b-kube-api-access-mwr2s\") pod \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.299130 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-config-data\") pod \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.299397 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-scripts\") pod \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\" (UID: \"d7c9a7af-b42a-4d69-93e1-78040788fe1b\") " Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.304754 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c9a7af-b42a-4d69-93e1-78040788fe1b-kube-api-access-mwr2s" (OuterVolumeSpecName: "kube-api-access-mwr2s") pod "d7c9a7af-b42a-4d69-93e1-78040788fe1b" (UID: "d7c9a7af-b42a-4d69-93e1-78040788fe1b"). InnerVolumeSpecName "kube-api-access-mwr2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.305247 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-scripts" (OuterVolumeSpecName: "scripts") pod "d7c9a7af-b42a-4d69-93e1-78040788fe1b" (UID: "d7c9a7af-b42a-4d69-93e1-78040788fe1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.336820 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7c9a7af-b42a-4d69-93e1-78040788fe1b" (UID: "d7c9a7af-b42a-4d69-93e1-78040788fe1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.344708 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-config-data" (OuterVolumeSpecName: "config-data") pod "d7c9a7af-b42a-4d69-93e1-78040788fe1b" (UID: "d7c9a7af-b42a-4d69-93e1-78040788fe1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.403240 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.403270 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.403282 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwr2s\" (UniqueName: \"kubernetes.io/projected/d7c9a7af-b42a-4d69-93e1-78040788fe1b-kube-api-access-mwr2s\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.403290 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c9a7af-b42a-4d69-93e1-78040788fe1b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.726246 4759 generic.go:334] "Generic (PLEG): container finished" podID="4ec53225-5ccb-4be7-af07-c86a1931fea9" containerID="7752af2f399c656c6fece04afcefc5133f386037a09abd5c35d132bbd6dd43d3" exitCode=0 Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.726290 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" event={"ID":"4ec53225-5ccb-4be7-af07-c86a1931fea9","Type":"ContainerDied","Data":"7752af2f399c656c6fece04afcefc5133f386037a09abd5c35d132bbd6dd43d3"} Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.729548 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jf2gl" event={"ID":"d7c9a7af-b42a-4d69-93e1-78040788fe1b","Type":"ContainerDied","Data":"86622128009f5159e047f7fa4492ee37348b9dd8f1a5a7d3a641568e207e2a10"} Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.729605 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86622128009f5159e047f7fa4492ee37348b9dd8f1a5a7d3a641568e207e2a10" Dec 05 00:51:07 crc kubenswrapper[4759]: I1205 00:51:07.729615 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-jf2gl" Dec 05 00:51:08 crc kubenswrapper[4759]: I1205 00:51:08.156204 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:51:08 crc kubenswrapper[4759]: E1205 00:51:08.156843 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.305151 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.456874 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-ssh-key\") pod \"4ec53225-5ccb-4be7-af07-c86a1931fea9\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.456966 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-repo-setup-combined-ca-bundle\") pod \"4ec53225-5ccb-4be7-af07-c86a1931fea9\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.457048 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phbm4\" (UniqueName: \"kubernetes.io/projected/4ec53225-5ccb-4be7-af07-c86a1931fea9-kube-api-access-phbm4\") pod \"4ec53225-5ccb-4be7-af07-c86a1931fea9\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.457167 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-inventory\") pod \"4ec53225-5ccb-4be7-af07-c86a1931fea9\" (UID: \"4ec53225-5ccb-4be7-af07-c86a1931fea9\") " Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.462807 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec53225-5ccb-4be7-af07-c86a1931fea9-kube-api-access-phbm4" (OuterVolumeSpecName: "kube-api-access-phbm4") pod "4ec53225-5ccb-4be7-af07-c86a1931fea9" (UID: "4ec53225-5ccb-4be7-af07-c86a1931fea9"). InnerVolumeSpecName "kube-api-access-phbm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.464465 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4ec53225-5ccb-4be7-af07-c86a1931fea9" (UID: "4ec53225-5ccb-4be7-af07-c86a1931fea9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.491218 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-inventory" (OuterVolumeSpecName: "inventory") pod "4ec53225-5ccb-4be7-af07-c86a1931fea9" (UID: "4ec53225-5ccb-4be7-af07-c86a1931fea9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.495097 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4ec53225-5ccb-4be7-af07-c86a1931fea9" (UID: "4ec53225-5ccb-4be7-af07-c86a1931fea9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.559793 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.559824 4759 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.559836 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phbm4\" (UniqueName: \"kubernetes.io/projected/4ec53225-5ccb-4be7-af07-c86a1931fea9-kube-api-access-phbm4\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.559845 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ec53225-5ccb-4be7-af07-c86a1931fea9-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.756391 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" event={"ID":"4ec53225-5ccb-4be7-af07-c86a1931fea9","Type":"ContainerDied","Data":"a82163d7c02fa9abd8903b17ab05d0fca8225757f3cb22c9fbf25a7531c19131"} Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.756643 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a82163d7c02fa9abd8903b17ab05d0fca8225757f3cb22c9fbf25a7531c19131" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.756495 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.859912 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf"] Dec 05 00:51:09 crc kubenswrapper[4759]: E1205 00:51:09.860282 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c9a7af-b42a-4d69-93e1-78040788fe1b" containerName="aodh-db-sync" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.860305 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c9a7af-b42a-4d69-93e1-78040788fe1b" containerName="aodh-db-sync" Dec 05 00:51:09 crc kubenswrapper[4759]: E1205 00:51:09.860351 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2799ca-c468-46f0-8e2a-689e6d6bf81b" containerName="heat-engine" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.860358 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2799ca-c468-46f0-8e2a-689e6d6bf81b" containerName="heat-engine" Dec 05 00:51:09 crc kubenswrapper[4759]: E1205 00:51:09.860378 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec53225-5ccb-4be7-af07-c86a1931fea9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.860386 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec53225-5ccb-4be7-af07-c86a1931fea9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.860571 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b2799ca-c468-46f0-8e2a-689e6d6bf81b" containerName="heat-engine" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.860585 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ec53225-5ccb-4be7-af07-c86a1931fea9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.860608 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c9a7af-b42a-4d69-93e1-78040788fe1b" containerName="aodh-db-sync" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.861250 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.863900 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.865336 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.865654 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.865800 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.888116 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf"] Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.966457 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.966596 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8kvl\" (UniqueName: \"kubernetes.io/projected/92c0abf7-a6cc-411f-bccc-778819b2370d-kube-api-access-g8kvl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.966743 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:09 crc kubenswrapper[4759]: I1205 00:51:09.966775 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.068590 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.068767 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8kvl\" (UniqueName: \"kubernetes.io/projected/92c0abf7-a6cc-411f-bccc-778819b2370d-kube-api-access-g8kvl\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.068845 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.068875 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.086794 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.086938 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.086989 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.090011 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8kvl\" (UniqueName: \"kubernetes.io/projected/92c0abf7-a6cc-411f-bccc-778819b2370d-kube-api-access-g8kvl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.183724 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.838501 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf"] Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.883057 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.883356 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-api" containerID="cri-o://2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1" gracePeriod=30 Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.883477 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-notifier" containerID="cri-o://badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87" gracePeriod=30 Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.883484 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-listener" containerID="cri-o://559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a" gracePeriod=30 Dec 05 00:51:10 crc kubenswrapper[4759]: I1205 00:51:10.883510 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-evaluator" containerID="cri-o://1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135" gracePeriod=30 Dec 05 00:51:11 crc kubenswrapper[4759]: I1205 00:51:11.777213 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" event={"ID":"92c0abf7-a6cc-411f-bccc-778819b2370d","Type":"ContainerStarted","Data":"b58828a728d123d6832ce7f98ab3312299d3d9ff44fc5d3e05f86eb3909ecd69"} Dec 05 00:51:11 crc kubenswrapper[4759]: I1205 00:51:11.777749 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" event={"ID":"92c0abf7-a6cc-411f-bccc-778819b2370d","Type":"ContainerStarted","Data":"43c792178a80412e77dfdddc1e47ba93ed81f0d8f3348c2a48e76222425864a1"} Dec 05 00:51:11 crc kubenswrapper[4759]: I1205 00:51:11.781101 4759 generic.go:334] "Generic (PLEG): container finished" podID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerID="1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135" exitCode=0 Dec 05 00:51:11 crc kubenswrapper[4759]: I1205 00:51:11.781147 4759 generic.go:334] "Generic (PLEG): container finished" podID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerID="2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1" exitCode=0 Dec 05 00:51:11 crc kubenswrapper[4759]: I1205 00:51:11.781183 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277","Type":"ContainerDied","Data":"1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135"} Dec 05 00:51:11 crc kubenswrapper[4759]: I1205 00:51:11.781226 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277","Type":"ContainerDied","Data":"2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1"} Dec 05 
00:51:11 crc kubenswrapper[4759]: I1205 00:51:11.802590 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" podStartSLOduration=2.377017586 podStartE2EDuration="2.802570222s" podCreationTimestamp="2025-12-05 00:51:09 +0000 UTC" firstStartedPulling="2025-12-05 00:51:10.842173424 +0000 UTC m=+1690.057834374" lastFinishedPulling="2025-12-05 00:51:11.26772606 +0000 UTC m=+1690.483387010" observedRunningTime="2025-12-05 00:51:11.793395319 +0000 UTC m=+1691.009056299" watchObservedRunningTime="2025-12-05 00:51:11.802570222 +0000 UTC m=+1691.018231172" Dec 05 00:51:12 crc kubenswrapper[4759]: I1205 00:51:12.794483 4759 generic.go:334] "Generic (PLEG): container finished" podID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerID="badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87" exitCode=0 Dec 05 00:51:12 crc kubenswrapper[4759]: I1205 00:51:12.794582 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277","Type":"ContainerDied","Data":"badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87"} Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.492428 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.585979 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-scripts\") pod \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.586067 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml82s\" (UniqueName: \"kubernetes.io/projected/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-kube-api-access-ml82s\") pod \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.586147 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-public-tls-certs\") pod \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.586195 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-config-data\") pod \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.586218 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-combined-ca-bundle\") pod \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.586364 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-internal-tls-certs\") pod \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\" (UID: \"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277\") " Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.606225 4759 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-kube-api-access-ml82s" (OuterVolumeSpecName: "kube-api-access-ml82s") pod "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" (UID: "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277"). InnerVolumeSpecName "kube-api-access-ml82s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.606508 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-scripts" (OuterVolumeSpecName: "scripts") pod "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" (UID: "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.694369 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.694422 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml82s\" (UniqueName: \"kubernetes.io/projected/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-kube-api-access-ml82s\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.694862 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" (UID: "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.728869 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" (UID: "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.771782 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" (UID: "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.772467 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-config-data" (OuterVolumeSpecName: "config-data") pod "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" (UID: "5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.796370 4759 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.796404 4759 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.796418 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.796432 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.835585 4759 generic.go:334] "Generic (PLEG): container finished" podID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerID="559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a" exitCode=0 Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.835632 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277","Type":"ContainerDied","Data":"559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a"} Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.835671 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277","Type":"ContainerDied","Data":"09aad32a0d578c0a4029c2cbfaea9488314152a21f06ece3396f07f840fd2d78"} Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.835692 4759 scope.go:117] "RemoveContainer" containerID="559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.835717 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.877601 4759 scope.go:117] "RemoveContainer" containerID="badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.907994 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.925090 4759 scope.go:117] "RemoveContainer" containerID="1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.929870 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.939521 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 05 00:51:15 crc kubenswrapper[4759]: E1205 00:51:15.939979 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-api" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.940002 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-api" Dec 05 00:51:15 crc kubenswrapper[4759]: E1205 00:51:15.940015 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-listener" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.940022 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-listener" Dec 05 00:51:15 crc kubenswrapper[4759]: E1205 00:51:15.940033 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-evaluator" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.940040 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-evaluator" Dec 05 00:51:15 crc kubenswrapper[4759]: E1205 00:51:15.940052 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-notifier" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.940058 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-notifier" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.940266 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-api" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.940291 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-notifier" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.940299 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-listener" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.940327 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" containerName="aodh-evaluator" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.942284 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.944958 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.948272 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.948575 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.948697 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.949377 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-mr6x2" Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.949919 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 00:51:15 crc kubenswrapper[4759]: I1205 00:51:15.976507 4759 scope.go:117] "RemoveContainer" containerID="2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.006541 4759 scope.go:117] "RemoveContainer" containerID="559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a" Dec 05 00:51:16 crc kubenswrapper[4759]: E1205 00:51:16.007050 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a\": container with ID starting with 559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a not found: ID does not exist" containerID="559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.007091 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a"} err="failed to get container status \"559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a\": rpc error: code = NotFound desc = could not find container \"559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a\": container with ID starting with 559344d467c606555e061ea1e651ec2a9e7c46c3808554d768b99f675752250a not found: ID does not exist" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.007120 4759 scope.go:117] "RemoveContainer" containerID="badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87" Dec 05 00:51:16 crc kubenswrapper[4759]: E1205 00:51:16.007425 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87\": container with ID starting with badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87 not found: ID does not exist" containerID="badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.007465 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87"} err="failed to get container status \"badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87\": rpc error: code = NotFound desc = could not find container \"badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87\": 
container with ID starting with badc4694ab27428ed4aef88e258de28ef52d7dcdfbcb62c21ede154715d41f87 not found: ID does not exist" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.007496 4759 scope.go:117] "RemoveContainer" containerID="1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135" Dec 05 00:51:16 crc kubenswrapper[4759]: E1205 00:51:16.007779 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135\": container with ID starting with 1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135 not found: ID does not exist" containerID="1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.007804 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135"} err="failed to get container status \"1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135\": rpc error: code = NotFound desc = could not find container \"1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135\": container with ID starting with 1168391374f3cbfc1a820a4150bd7501481dd78f7537bb7baaa1c4982b183135 not found: ID does not exist" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.007821 4759 scope.go:117] "RemoveContainer" containerID="2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1" Dec 05 00:51:16 crc kubenswrapper[4759]: E1205 00:51:16.008018 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1\": container with ID starting with 2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1 not found: ID does not exist" containerID="2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.008040 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1"} err="failed to get container status \"2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1\": rpc error: code = NotFound desc = could not find container \"2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1\": container with ID starting with 2f5c542da8b24d3829a5ccbe6bf34939106dfe2db112118d75e2a056d5ab17a1 not found: ID does not exist" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.101924 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-internal-tls-certs\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.102138 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-scripts\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.102221 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-public-tls-certs\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.102338 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.102469 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-config-data\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.102586 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxh2l\" (UniqueName: \"kubernetes.io/projected/dab6929f-0e72-4f06-84cd-c3db7967578f-kube-api-access-wxh2l\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.204535 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.204815 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-config-data\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.204931 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxh2l\" (UniqueName: \"kubernetes.io/projected/dab6929f-0e72-4f06-84cd-c3db7967578f-kube-api-access-wxh2l\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.205133 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-internal-tls-certs\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.205228 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-scripts\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.205360 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-public-tls-certs\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.208629 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-internal-tls-certs\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.208754 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-public-tls-certs\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.209665 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.210590 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-config-data\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.211264 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dab6929f-0e72-4f06-84cd-c3db7967578f-scripts\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.225816 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxh2l\" (UniqueName: \"kubernetes.io/projected/dab6929f-0e72-4f06-84cd-c3db7967578f-kube-api-access-wxh2l\") pod \"aodh-0\" (UID: \"dab6929f-0e72-4f06-84cd-c3db7967578f\") " pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.274719 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 05 00:51:16 crc kubenswrapper[4759]: W1205 00:51:16.790693 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab6929f_0e72_4f06_84cd_c3db7967578f.slice/crio-6662c88f69c082806a602f7a2e146addd566ae306b86c5d02d00d67425728b62 WatchSource:0}: Error finding container 6662c88f69c082806a602f7a2e146addd566ae306b86c5d02d00d67425728b62: Status 404 returned error can't find the container with id 6662c88f69c082806a602f7a2e146addd566ae306b86c5d02d00d67425728b62 Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.799748 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 05 00:51:16 crc kubenswrapper[4759]: I1205 00:51:16.851851 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"dab6929f-0e72-4f06-84cd-c3db7967578f","Type":"ContainerStarted","Data":"6662c88f69c082806a602f7a2e146addd566ae306b86c5d02d00d67425728b62"} Dec 05 00:51:17 crc kubenswrapper[4759]: I1205 00:51:17.170964 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277" path="/var/lib/kubelet/pods/5b51fe2f-45ad-4c6a-a4ca-fdc8b8eea277/volumes" Dec 05 00:51:17 crc kubenswrapper[4759]: I1205 00:51:17.864144 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"dab6929f-0e72-4f06-84cd-c3db7967578f","Type":"ContainerStarted","Data":"2f140d9d0dc3d1baeac858bb63d92187962bd0a75729ee3a3c06e1a361f171b8"} Dec 05 00:51:18 crc kubenswrapper[4759]: I1205 00:51:18.880791 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"dab6929f-0e72-4f06-84cd-c3db7967578f","Type":"ContainerStarted","Data":"d8a7edeacb46e1ca1bd4ddee5f966eca092833a679453788d535fa0cb0b4afbe"} Dec 05 00:51:19 crc kubenswrapper[4759]: I1205 00:51:19.904442 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"dab6929f-0e72-4f06-84cd-c3db7967578f","Type":"ContainerStarted","Data":"f1e2fdb6e0a36ce6b0195c04a2dee06fea824918c651660fa1e25eeef03d5a08"} Dec 05 00:51:20 crc kubenswrapper[4759]: I1205 00:51:20.918020 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"dab6929f-0e72-4f06-84cd-c3db7967578f","Type":"ContainerStarted","Data":"5299e5532682b2fddbccd7be3d3f40c5bf912dec837515e778f74eb8f1b13ea1"} Dec 05 00:51:20 crc kubenswrapper[4759]: I1205 00:51:20.942694 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.098013469 podStartE2EDuration="5.942674216s" podCreationTimestamp="2025-12-05 00:51:15 +0000 UTC" firstStartedPulling="2025-12-05 00:51:16.794756916 +0000 UTC m=+1696.010417866" lastFinishedPulling="2025-12-05 00:51:20.639417653 +0000 UTC m=+1699.855078613" observedRunningTime="2025-12-05 00:51:20.93665635 +0000 UTC m=+1700.152317300" watchObservedRunningTime="2025-12-05 00:51:20.942674216 +0000 UTC m=+1700.158335166" Dec 05 00:51:23 crc kubenswrapper[4759]: I1205 00:51:23.156686 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:51:23 crc kubenswrapper[4759]: E1205 00:51:23.157274 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:51:25 crc kubenswrapper[4759]: I1205 00:51:25.254177 4759 scope.go:117] "RemoveContainer" containerID="49888a093825543ae1d84c4d182ad7b4b8480f8bc9ad71fa039380f503c1de8b" Dec 05 00:51:25 crc kubenswrapper[4759]: I1205 00:51:25.307867 4759 scope.go:117] "RemoveContainer" containerID="aa8da572cf07e326f227a0ee1325d03405affadfadde5b8c194e6003455db497" Dec 05 00:51:35 crc kubenswrapper[4759]: I1205 00:51:35.156287 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:51:35 crc kubenswrapper[4759]: E1205 00:51:35.157164 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:51:48 crc kubenswrapper[4759]: I1205 00:51:48.156661 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:51:48 crc kubenswrapper[4759]: E1205 00:51:48.157607 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:52:03 crc kubenswrapper[4759]: I1205 00:52:03.155866 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:52:03 crc kubenswrapper[4759]: E1205 00:52:03.156811 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.238227 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7xvhw"] Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.246220 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.276779 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7xvhw"] Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.377881 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-utilities\") pod \"redhat-operators-7xvhw\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.377964 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlx4v\" (UniqueName: \"kubernetes.io/projected/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-kube-api-access-xlx4v\") pod \"redhat-operators-7xvhw\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.379438 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-catalog-content\") pod \"redhat-operators-7xvhw\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.482299 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-utilities\") pod \"redhat-operators-7xvhw\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.482384 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlx4v\" (UniqueName: \"kubernetes.io/projected/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-kube-api-access-xlx4v\") pod \"redhat-operators-7xvhw\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.482484 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-catalog-content\") pod \"redhat-operators-7xvhw\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.483056 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-catalog-content\") pod \"redhat-operators-7xvhw\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.483613 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-utilities\") pod \"redhat-operators-7xvhw\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.505548 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xlx4v\" (UniqueName: \"kubernetes.io/projected/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-kube-api-access-xlx4v\") pod \"redhat-operators-7xvhw\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:14 crc kubenswrapper[4759]: I1205 00:52:14.584583 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:15 crc kubenswrapper[4759]: I1205 00:52:15.088659 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7xvhw"] Dec 05 00:52:15 crc kubenswrapper[4759]: I1205 00:52:15.159448 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:52:15 crc kubenswrapper[4759]: E1205 00:52:15.160156 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:52:15 crc kubenswrapper[4759]: I1205 00:52:15.302555 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xvhw" event={"ID":"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b","Type":"ContainerStarted","Data":"b9aa8f4a3c05c03228b193ac2e280af6df974179a04cc94a006e549fb95b0157"} Dec 05 00:52:16 crc kubenswrapper[4759]: I1205 00:52:16.317651 4759 generic.go:334] "Generic (PLEG): container finished" podID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerID="9e8e88b668f5678793c3032c27cf670b1d9fbc3a65c0d3839d528c24e2c5840a" exitCode=0 Dec 05 00:52:16 crc kubenswrapper[4759]: I1205 00:52:16.317718 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xvhw" event={"ID":"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b","Type":"ContainerDied","Data":"9e8e88b668f5678793c3032c27cf670b1d9fbc3a65c0d3839d528c24e2c5840a"} Dec 05 00:52:17 crc kubenswrapper[4759]: I1205 00:52:17.334058 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xvhw" event={"ID":"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b","Type":"ContainerStarted","Data":"2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c"} Dec 05 00:52:21 crc kubenswrapper[4759]: I1205 00:52:21.399805 4759 generic.go:334] "Generic (PLEG): container finished" podID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerID="2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c" exitCode=0 Dec 05 00:52:21 crc kubenswrapper[4759]: I1205 00:52:21.399901 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xvhw" event={"ID":"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b","Type":"ContainerDied","Data":"2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c"} Dec 05 00:52:22 crc kubenswrapper[4759]: I1205 00:52:22.414808 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xvhw" event={"ID":"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b","Type":"ContainerStarted","Data":"4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281"} Dec 05 00:52:22 crc kubenswrapper[4759]: I1205 00:52:22.456184 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-7xvhw" podStartSLOduration=2.845965493 podStartE2EDuration="8.456152942s" podCreationTimestamp="2025-12-05 00:52:14 +0000 UTC" firstStartedPulling="2025-12-05 00:52:16.319657268 +0000 UTC m=+1755.535318238" lastFinishedPulling="2025-12-05 00:52:21.929844697 +0000 UTC m=+1761.145505687" observedRunningTime="2025-12-05 00:52:22.43629199 +0000 UTC m=+1761.651952940" watchObservedRunningTime="2025-12-05 00:52:22.456152942 +0000 UTC m=+1761.671813912" Dec 05 00:52:24 crc kubenswrapper[4759]: I1205 00:52:24.585764 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:24 crc kubenswrapper[4759]: I1205 00:52:24.586152 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:25 crc kubenswrapper[4759]: I1205 00:52:25.540234 4759 scope.go:117] "RemoveContainer" containerID="cec3ad6a10e6c07bb5dc6c263fe2871b09d752d55b9b4389d1b9211358ac3d53" Dec 05 00:52:25 crc kubenswrapper[4759]: I1205 00:52:25.682489 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7xvhw" podUID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerName="registry-server" probeResult="failure" output=< Dec 05 00:52:25 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 00:52:25 crc kubenswrapper[4759]: > Dec 05 00:52:26 crc kubenswrapper[4759]: I1205 00:52:26.156349 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:52:26 crc kubenswrapper[4759]: E1205 00:52:26.157046 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:52:34 crc kubenswrapper[4759]: I1205 00:52:34.680504 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:34 crc kubenswrapper[4759]: I1205 00:52:34.793228 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:35 crc kubenswrapper[4759]: I1205 00:52:34.999689 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7xvhw"] Dec 05 00:52:36 crc kubenswrapper[4759]: I1205 00:52:36.660844 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7xvhw" podUID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerName="registry-server" containerID="cri-o://4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281" gracePeriod=2 Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.166873 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.206729 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-catalog-content\") pod \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.207205 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlx4v\" (UniqueName: \"kubernetes.io/projected/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-kube-api-access-xlx4v\") pod \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.207482 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-utilities\") pod \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\" (UID: \"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b\") " Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.246775 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-utilities" (OuterVolumeSpecName: "utilities") pod "da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" (UID: "da6308a7-1ac0-4dcd-b2e2-70ab99453c1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.254533 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-kube-api-access-xlx4v" (OuterVolumeSpecName: "kube-api-access-xlx4v") pod "da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" (UID: "da6308a7-1ac0-4dcd-b2e2-70ab99453c1b"). InnerVolumeSpecName "kube-api-access-xlx4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.310603 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.310633 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlx4v\" (UniqueName: \"kubernetes.io/projected/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-kube-api-access-xlx4v\") on node \"crc\" DevicePath \"\"" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.346118 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" (UID: "da6308a7-1ac0-4dcd-b2e2-70ab99453c1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.412040 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.679911 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7xvhw" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.679889 4759 generic.go:334] "Generic (PLEG): container finished" podID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerID="4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281" exitCode=0 Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.679923 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xvhw" event={"ID":"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b","Type":"ContainerDied","Data":"4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281"} Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.680103 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xvhw" event={"ID":"da6308a7-1ac0-4dcd-b2e2-70ab99453c1b","Type":"ContainerDied","Data":"b9aa8f4a3c05c03228b193ac2e280af6df974179a04cc94a006e549fb95b0157"} Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.680141 4759 scope.go:117] "RemoveContainer" containerID="4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.736206 4759 scope.go:117] "RemoveContainer" containerID="2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.754103 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7xvhw"] Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.767250 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7xvhw"] Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.783982 4759 scope.go:117] "RemoveContainer" containerID="9e8e88b668f5678793c3032c27cf670b1d9fbc3a65c0d3839d528c24e2c5840a" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.831646 4759 scope.go:117] "RemoveContainer" containerID="4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281" Dec 05 00:52:37 crc kubenswrapper[4759]: E1205 00:52:37.832146 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281\": container with ID starting with 4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281 not found: ID does not exist" containerID="4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.832203 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281"} err="failed to get container status \"4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281\": rpc error: code = NotFound desc = could not find container \"4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281\": container with ID starting with 4cb94fc0c9e20808f7bff25904ffb9aa4e9bbbbddc017a7a4bf2aaf5c52a2281 not found: ID does not exist" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.832233 4759 scope.go:117] "RemoveContainer" containerID="2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c" Dec 05 00:52:37 crc kubenswrapper[4759]: E1205 00:52:37.832671 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c\": container with ID starting with 
2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c not found: ID does not exist" containerID="2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.832708 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c"} err="failed to get container status \"2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c\": rpc error: code = NotFound desc = could not find container \"2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c\": container with ID starting with 2131be6abb3296f816b4e435150967b4841917339a6b3ec9be1884f4f42e615c not found: ID does not exist" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.832734 4759 scope.go:117] "RemoveContainer" containerID="9e8e88b668f5678793c3032c27cf670b1d9fbc3a65c0d3839d528c24e2c5840a" Dec 05 00:52:37 crc kubenswrapper[4759]: E1205 00:52:37.833230 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e8e88b668f5678793c3032c27cf670b1d9fbc3a65c0d3839d528c24e2c5840a\": container with ID starting with 9e8e88b668f5678793c3032c27cf670b1d9fbc3a65c0d3839d528c24e2c5840a not found: ID does not exist" containerID="9e8e88b668f5678793c3032c27cf670b1d9fbc3a65c0d3839d528c24e2c5840a" Dec 05 00:52:37 crc kubenswrapper[4759]: I1205 00:52:37.833279 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e8e88b668f5678793c3032c27cf670b1d9fbc3a65c0d3839d528c24e2c5840a"} err="failed to get container status \"9e8e88b668f5678793c3032c27cf670b1d9fbc3a65c0d3839d528c24e2c5840a\": rpc error: code = NotFound desc = could not find container \"9e8e88b668f5678793c3032c27cf670b1d9fbc3a65c0d3839d528c24e2c5840a\": container with ID starting with 9e8e88b668f5678793c3032c27cf670b1d9fbc3a65c0d3839d528c24e2c5840a not found: ID does not exist" Dec 05 00:52:39 crc kubenswrapper[4759]: I1205 00:52:39.156071 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:52:39 crc kubenswrapper[4759]: E1205 00:52:39.156870 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:52:39 crc kubenswrapper[4759]: I1205 00:52:39.176583 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" path="/var/lib/kubelet/pods/da6308a7-1ac0-4dcd-b2e2-70ab99453c1b/volumes" Dec 05 00:52:51 crc kubenswrapper[4759]: I1205 00:52:51.185338 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:52:51 crc kubenswrapper[4759]: E1205 00:52:51.186115 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" 
podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:53:05 crc kubenswrapper[4759]: I1205 00:53:05.155928 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:53:06 crc kubenswrapper[4759]: I1205 00:53:06.105098 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"b621a27b3e273f787ffd52c76f267d74b63dedbd64dc94442d2b4dbf1f2c61f8"} Dec 05 00:53:25 crc kubenswrapper[4759]: I1205 00:53:25.664443 4759 scope.go:117] "RemoveContainer" containerID="186415a7ed2030aad90059c2d1e7bd3ce30f31ebfad7a40ddde9f8d3881cb4d5" Dec 05 00:53:25 crc kubenswrapper[4759]: I1205 00:53:25.711228 4759 scope.go:117] "RemoveContainer" containerID="f0080ba3f01b33a6e53a87494e4ada708db52a9e16f2c01af9cbf29c650c23b1" Dec 05 00:54:25 crc kubenswrapper[4759]: I1205 00:54:25.879837 4759 scope.go:117] "RemoveContainer" containerID="37efa3277bf835ca8e9c882acc6c888664033213e07c04b2f8e629679366ef8c" Dec 05 00:54:25 crc kubenswrapper[4759]: I1205 00:54:25.922710 4759 scope.go:117] "RemoveContainer" containerID="eba0d7a4f2df32e5ff9452fcc0e83e1591ebd907fe3b06891b0091c1ffc6960f" Dec 05 00:54:25 crc kubenswrapper[4759]: I1205 00:54:25.953661 4759 scope.go:117] "RemoveContainer" containerID="9bca42c9fbcb33a3a1562e999f2308ae4f0ac371dc7da7e17be6d6af558ed7c0" Dec 05 00:54:25 crc kubenswrapper[4759]: I1205 00:54:25.998158 4759 scope.go:117] "RemoveContainer" containerID="42d72ec345784c9efad3ab9637081e0ae1af2b0c362d1e0be19903481debfa7a" Dec 05 00:54:26 crc kubenswrapper[4759]: I1205 00:54:26.027043 4759 scope.go:117] "RemoveContainer" containerID="045cbf82ddbb75a9caf5c5a8a1ee7cb4b38e18c7610bf0dc8539e4a6be98f56c" Dec 05 00:54:26 crc kubenswrapper[4759]: I1205 00:54:26.053235 4759 scope.go:117] "RemoveContainer" containerID="3700d31f2a7981c9d7d9609971c4985fab2655b62afbd25b16860029e3190c83" Dec 05 00:54:26 crc kubenswrapper[4759]: I1205 00:54:26.114130 4759 scope.go:117] "RemoveContainer" containerID="c6c20032a68e4777e37abb6e04d9ce16aba62e8b83af9c129f0017656c2878ca" Dec 05 00:54:26 crc kubenswrapper[4759]: I1205 00:54:26.145009 4759 scope.go:117] "RemoveContainer" containerID="6c87b322114ef2228e55573278aa7b48d5e29514390c44f635991df47cae5c1b" Dec 05 00:54:26 crc kubenswrapper[4759]: I1205 00:54:26.172565 4759 scope.go:117] "RemoveContainer" containerID="4fa30eb77d4842983a53fd0ebfd1affd4a3be6d64b7096531352a7a6ec58bed8" Dec 05 00:54:29 crc kubenswrapper[4759]: I1205 00:54:29.451382 4759 generic.go:334] "Generic (PLEG): container finished" podID="92c0abf7-a6cc-411f-bccc-778819b2370d" containerID="b58828a728d123d6832ce7f98ab3312299d3d9ff44fc5d3e05f86eb3909ecd69" exitCode=0 Dec 05 00:54:29 crc kubenswrapper[4759]: I1205 00:54:29.451614 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" event={"ID":"92c0abf7-a6cc-411f-bccc-778819b2370d","Type":"ContainerDied","Data":"b58828a728d123d6832ce7f98ab3312299d3d9ff44fc5d3e05f86eb3909ecd69"} Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.119572 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.237171 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-inventory\") pod \"92c0abf7-a6cc-411f-bccc-778819b2370d\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.237440 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-bootstrap-combined-ca-bundle\") pod \"92c0abf7-a6cc-411f-bccc-778819b2370d\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.237518 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-ssh-key\") pod \"92c0abf7-a6cc-411f-bccc-778819b2370d\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.237673 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8kvl\" (UniqueName: \"kubernetes.io/projected/92c0abf7-a6cc-411f-bccc-778819b2370d-kube-api-access-g8kvl\") pod \"92c0abf7-a6cc-411f-bccc-778819b2370d\" (UID: \"92c0abf7-a6cc-411f-bccc-778819b2370d\") " Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.243247 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "92c0abf7-a6cc-411f-bccc-778819b2370d" (UID: "92c0abf7-a6cc-411f-bccc-778819b2370d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.252512 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92c0abf7-a6cc-411f-bccc-778819b2370d-kube-api-access-g8kvl" (OuterVolumeSpecName: "kube-api-access-g8kvl") pod "92c0abf7-a6cc-411f-bccc-778819b2370d" (UID: "92c0abf7-a6cc-411f-bccc-778819b2370d"). InnerVolumeSpecName "kube-api-access-g8kvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.285436 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-inventory" (OuterVolumeSpecName: "inventory") pod "92c0abf7-a6cc-411f-bccc-778819b2370d" (UID: "92c0abf7-a6cc-411f-bccc-778819b2370d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.288248 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "92c0abf7-a6cc-411f-bccc-778819b2370d" (UID: "92c0abf7-a6cc-411f-bccc-778819b2370d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.340375 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.340423 4759 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.340441 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92c0abf7-a6cc-411f-bccc-778819b2370d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.340490 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8kvl\" (UniqueName: \"kubernetes.io/projected/92c0abf7-a6cc-411f-bccc-778819b2370d-kube-api-access-g8kvl\") on node \"crc\" DevicePath \"\"" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.480856 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" event={"ID":"92c0abf7-a6cc-411f-bccc-778819b2370d","Type":"ContainerDied","Data":"43c792178a80412e77dfdddc1e47ba93ed81f0d8f3348c2a48e76222425864a1"} Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.480959 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.481762 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43c792178a80412e77dfdddc1e47ba93ed81f0d8f3348c2a48e76222425864a1" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.605111 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5"] Dec 05 00:54:31 crc kubenswrapper[4759]: E1205 00:54:31.605860 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerName="extract-content" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.605893 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerName="extract-content" Dec 05 00:54:31 crc kubenswrapper[4759]: E1205 00:54:31.605941 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92c0abf7-a6cc-411f-bccc-778819b2370d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.605956 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c0abf7-a6cc-411f-bccc-778819b2370d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 00:54:31 crc kubenswrapper[4759]: E1205 00:54:31.605983 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerName="registry-server" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.605995 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerName="registry-server" Dec 05 00:54:31 crc kubenswrapper[4759]: E1205 00:54:31.606019 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerName="extract-utilities" Dec 
05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.606032 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerName="extract-utilities" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.606490 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="da6308a7-1ac0-4dcd-b2e2-70ab99453c1b" containerName="registry-server" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.606548 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="92c0abf7-a6cc-411f-bccc-778819b2370d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.607813 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.609931 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.610548 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.610937 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.612044 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.633497 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5"] Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.750710 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.750931 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.751004 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr94r\" (UniqueName: \"kubernetes.io/projected/95f4137c-2b87-401f-ba79-56befbbd9757-kube-api-access-hr94r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.852964 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.853052 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr94r\" (UniqueName: \"kubernetes.io/projected/95f4137c-2b87-401f-ba79-56befbbd9757-kube-api-access-hr94r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.853189 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.858618 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.859258 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.875749 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr94r\" (UniqueName: \"kubernetes.io/projected/95f4137c-2b87-401f-ba79-56befbbd9757-kube-api-access-hr94r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:54:31 crc kubenswrapper[4759]: I1205 00:54:31.938566 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:54:32 crc kubenswrapper[4759]: I1205 00:54:32.514110 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5"] Dec 05 00:54:33 crc kubenswrapper[4759]: I1205 00:54:33.503138 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" event={"ID":"95f4137c-2b87-401f-ba79-56befbbd9757","Type":"ContainerStarted","Data":"97c717893f976cc3ff54ad450decc2c75796424effb83e3348c113847f657353"} Dec 05 00:54:33 crc kubenswrapper[4759]: I1205 00:54:33.503444 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" event={"ID":"95f4137c-2b87-401f-ba79-56befbbd9757","Type":"ContainerStarted","Data":"0eb4409342c6acb70b7b5cb7a10106976c545f8aa713d24ada4c73f366d86586"} Dec 05 00:54:33 crc kubenswrapper[4759]: I1205 00:54:33.527942 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" podStartSLOduration=2.071561853 podStartE2EDuration="2.527923208s" podCreationTimestamp="2025-12-05 00:54:31 +0000 UTC" firstStartedPulling="2025-12-05 00:54:32.524416222 +0000 UTC m=+1891.740077172" lastFinishedPulling="2025-12-05 00:54:32.980777537 +0000 UTC m=+1892.196438527" observedRunningTime="2025-12-05 00:54:33.523545882 +0000 UTC m=+1892.739206852" watchObservedRunningTime="2025-12-05 00:54:33.527923208 +0000 UTC m=+1892.743584168" Dec 05 00:55:13 crc kubenswrapper[4759]: I1205 00:55:13.081336 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-7cmgd"] Dec 05 00:55:13 crc kubenswrapper[4759]: I1205 00:55:13.099006 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-7cmgd"] Dec 05 00:55:13 crc kubenswrapper[4759]: I1205 00:55:13.178652 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2308f19c-fca8-4c34-9324-c063a3f03433" path="/var/lib/kubelet/pods/2308f19c-fca8-4c34-9324-c063a3f03433/volumes" Dec 05 00:55:14 crc kubenswrapper[4759]: I1205 00:55:14.048089 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-08d7-account-create-update-hk8fj"] Dec 05 00:55:14 crc kubenswrapper[4759]: I1205 00:55:14.069531 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-08d7-account-create-update-hk8fj"] Dec 05 00:55:15 crc kubenswrapper[4759]: I1205 00:55:15.182598 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6764ecf-06a5-4456-b58f-5bf5625e56f0" path="/var/lib/kubelet/pods/e6764ecf-06a5-4456-b58f-5bf5625e56f0/volumes" Dec 05 00:55:19 crc kubenswrapper[4759]: I1205 00:55:19.056625 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-8fe5-account-create-update-bn9md"] Dec 05 00:55:19 crc kubenswrapper[4759]: I1205 00:55:19.076581 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv"] Dec 05 00:55:19 crc kubenswrapper[4759]: I1205 00:55:19.090412 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-8fe5-account-create-update-bn9md"] Dec 05 00:55:19 crc kubenswrapper[4759]: I1205 00:55:19.103093 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/mysqld-exporter-openstack-cell1-db-create-xp7tv"] Dec 05 00:55:19 crc kubenswrapper[4759]: I1205 00:55:19.177667 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca571ecf-ba87-47b0-acb6-a76d034f0b32" path="/var/lib/kubelet/pods/ca571ecf-ba87-47b0-acb6-a76d034f0b32/volumes" Dec 05 00:55:19 crc kubenswrapper[4759]: I1205 00:55:19.179225 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7" path="/var/lib/kubelet/pods/cf7d1943-eee7-4d3c-bd3a-3d8e41f0fee7/volumes" Dec 05 00:55:22 crc kubenswrapper[4759]: I1205 00:55:22.061684 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-xshzj"] Dec 05 00:55:22 crc kubenswrapper[4759]: I1205 00:55:22.083149 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-xshzj"] Dec 05 00:55:23 crc kubenswrapper[4759]: I1205 00:55:23.046844 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-456f-account-create-update-2fx6n"] Dec 05 00:55:23 crc kubenswrapper[4759]: I1205 00:55:23.063582 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-456f-account-create-update-2fx6n"] Dec 05 00:55:23 crc kubenswrapper[4759]: I1205 00:55:23.170829 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f589ff0-e6f7-430a-85a0-4cbc410b0ff8" path="/var/lib/kubelet/pods/4f589ff0-e6f7-430a-85a0-4cbc410b0ff8/volumes" Dec 05 00:55:23 crc kubenswrapper[4759]: I1205 00:55:23.171810 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce7ddb84-b42d-4a8b-ae8c-a263a618b408" path="/var/lib/kubelet/pods/ce7ddb84-b42d-4a8b-ae8c-a263a618b408/volumes" Dec 05 00:55:24 crc kubenswrapper[4759]: I1205 00:55:24.068648 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ftb6f"] Dec 05 00:55:24 crc kubenswrapper[4759]: I1205 00:55:24.086647 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-936e-account-create-update-69tpk"] Dec 05 00:55:24 crc kubenswrapper[4759]: I1205 00:55:24.097268 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7dea-account-create-update-mjncr"] Dec 05 00:55:24 crc kubenswrapper[4759]: I1205 00:55:24.106089 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8wnf5"] Dec 05 00:55:24 crc kubenswrapper[4759]: I1205 00:55:24.120194 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-936e-account-create-update-69tpk"] Dec 05 00:55:24 crc kubenswrapper[4759]: I1205 00:55:24.136525 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8wnf5"] Dec 05 00:55:24 crc kubenswrapper[4759]: I1205 00:55:24.151258 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ftb6f"] Dec 05 00:55:24 crc kubenswrapper[4759]: I1205 00:55:24.162729 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7dea-account-create-update-mjncr"] Dec 05 00:55:25 crc kubenswrapper[4759]: I1205 00:55:25.173033 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52600026-ecf0-4ee9-a517-e2e1005d3b5d" path="/var/lib/kubelet/pods/52600026-ecf0-4ee9-a517-e2e1005d3b5d/volumes" Dec 05 00:55:25 crc kubenswrapper[4759]: I1205 00:55:25.174638 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55592703-6d06-487c-a4e4-823be631891e" 
path="/var/lib/kubelet/pods/55592703-6d06-487c-a4e4-823be631891e/volumes" Dec 05 00:55:25 crc kubenswrapper[4759]: I1205 00:55:25.175502 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db21999b-a97e-4d2f-a9d1-c2f2b7049998" path="/var/lib/kubelet/pods/db21999b-a97e-4d2f-a9d1-c2f2b7049998/volumes" Dec 05 00:55:25 crc kubenswrapper[4759]: I1205 00:55:25.176299 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec67af89-9918-4e80-95e5-a90ed96c7c04" path="/var/lib/kubelet/pods/ec67af89-9918-4e80-95e5-a90ed96c7c04/volumes" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.332540 4759 scope.go:117] "RemoveContainer" containerID="308e6c3fa50fd86f198fcbac21519d3c0e4239dbc90e6ea87b2e95d3ac1923e3" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.383080 4759 scope.go:117] "RemoveContainer" containerID="324b888f605a24c9effa764025721279a8154a57d044279ff27d66542fcae63c" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.428907 4759 scope.go:117] "RemoveContainer" containerID="a8259ea1d2606a0abbc15370e1783aa517587a3b2b1beba8885113c527dcc025" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.461006 4759 scope.go:117] "RemoveContainer" containerID="1b74aed02e69ae52123cba02929413afb2d445abacd4f197515eceae2bbf81de" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.519245 4759 scope.go:117] "RemoveContainer" containerID="cacc93f2f921c3f2f226742855ac925546576dd67c848ae12403ed2aa2cc1f58" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.580187 4759 scope.go:117] "RemoveContainer" containerID="335748906aebd85fd2c5d81cc28c8d61f6bbc466f1d3dc489245408a6376e54f" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.622588 4759 scope.go:117] "RemoveContainer" containerID="6698bd51a4d64dbab26494920e39e1b35b726ac23780c08c197f0392200f84d2" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.669920 4759 scope.go:117] "RemoveContainer" containerID="78d4f07a56d424abbfcd627597f9237a64884cdcb70b081ea01e3bb50949420b" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.709594 4759 scope.go:117] "RemoveContainer" containerID="5d65692d8b3e6b75c1ba3daee41e2d256733b97789e9aa9bdded5fca431b4f97" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.733931 4759 scope.go:117] "RemoveContainer" containerID="f7047d6949ce3f8132b1d2ce7043864eb36118addfe08d0a3e6b08b2ad94a645" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.761178 4759 scope.go:117] "RemoveContainer" containerID="143f6072234304eba121d335063837cc15dea13c3c22abd9cfda90d478fa9e19" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.785472 4759 scope.go:117] "RemoveContainer" containerID="e1cbf0a8e4d70332f965890efdebd0853d650b56061d90f43e8f2ceeb1f27cc7" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.806743 4759 scope.go:117] "RemoveContainer" containerID="60c2aa04d7359cf944b73d19e7ce6bb16e766390f05563dde6e7fc590731357f" Dec 05 00:55:26 crc kubenswrapper[4759]: I1205 00:55:26.827172 4759 scope.go:117] "RemoveContainer" containerID="22443be551834cc4a8f2603a6bd39976dc6754f4c846b3a7a540c8696c611040" Dec 05 00:55:30 crc kubenswrapper[4759]: I1205 00:55:30.035755 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-njwfb"] Dec 05 00:55:30 crc kubenswrapper[4759]: I1205 00:55:30.051799 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-njwfb"] Dec 05 00:55:31 crc kubenswrapper[4759]: I1205 00:55:31.026775 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-4vclp"] 
Dec 05 00:55:31 crc kubenswrapper[4759]: I1205 00:55:31.034660 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-bbe7-account-create-update-tmcgm"] Dec 05 00:55:31 crc kubenswrapper[4759]: I1205 00:55:31.044690 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-4vclp"] Dec 05 00:55:31 crc kubenswrapper[4759]: I1205 00:55:31.070144 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-5cmtn"] Dec 05 00:55:31 crc kubenswrapper[4759]: I1205 00:55:31.084344 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-5cmtn"] Dec 05 00:55:31 crc kubenswrapper[4759]: I1205 00:55:31.094085 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-bbe7-account-create-update-tmcgm"] Dec 05 00:55:31 crc kubenswrapper[4759]: I1205 00:55:31.168040 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035a8e83-7e26-4eb5-939c-4c70a2c86d94" path="/var/lib/kubelet/pods/035a8e83-7e26-4eb5-939c-4c70a2c86d94/volumes" Dec 05 00:55:31 crc kubenswrapper[4759]: I1205 00:55:31.168718 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="179ac88c-cac6-44a8-9fa5-54bed12d118c" path="/var/lib/kubelet/pods/179ac88c-cac6-44a8-9fa5-54bed12d118c/volumes" Dec 05 00:55:31 crc kubenswrapper[4759]: I1205 00:55:31.169252 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e1f06ab-529e-4253-bf17-1255b1226d3f" path="/var/lib/kubelet/pods/5e1f06ab-529e-4253-bf17-1255b1226d3f/volumes" Dec 05 00:55:31 crc kubenswrapper[4759]: I1205 00:55:31.169792 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="859f7f90-7f35-4c95-80fd-240e91834ff6" path="/var/lib/kubelet/pods/859f7f90-7f35-4c95-80fd-240e91834ff6/volumes" Dec 05 00:55:32 crc kubenswrapper[4759]: I1205 00:55:32.035636 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6349-account-create-update-7kk9v"] Dec 05 00:55:32 crc kubenswrapper[4759]: I1205 00:55:32.048104 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-2b80-account-create-update-8nxsz"] Dec 05 00:55:32 crc kubenswrapper[4759]: I1205 00:55:32.059571 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3781-account-create-update-2sknq"] Dec 05 00:55:32 crc kubenswrapper[4759]: I1205 00:55:32.068443 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rn7b5"] Dec 05 00:55:32 crc kubenswrapper[4759]: I1205 00:55:32.077005 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rn7b5"] Dec 05 00:55:32 crc kubenswrapper[4759]: I1205 00:55:32.085194 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6349-account-create-update-7kk9v"] Dec 05 00:55:32 crc kubenswrapper[4759]: I1205 00:55:32.094254 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-2b80-account-create-update-8nxsz"] Dec 05 00:55:32 crc kubenswrapper[4759]: I1205 00:55:32.103916 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3781-account-create-update-2sknq"] Dec 05 00:55:33 crc kubenswrapper[4759]: I1205 00:55:33.170816 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb" path="/var/lib/kubelet/pods/4a0cc5a4-5676-4bcb-aa5f-2c84d7fc6aeb/volumes" Dec 05 00:55:33 crc kubenswrapper[4759]: I1205 00:55:33.172264 4759 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="4a213f68-4a29-4f0d-8f47-fb47a7ed1769" path="/var/lib/kubelet/pods/4a213f68-4a29-4f0d-8f47-fb47a7ed1769/volumes" Dec 05 00:55:33 crc kubenswrapper[4759]: I1205 00:55:33.173582 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57dcf39f-2def-45a4-b9f7-b9138e9a1a64" path="/var/lib/kubelet/pods/57dcf39f-2def-45a4-b9f7-b9138e9a1a64/volumes" Dec 05 00:55:33 crc kubenswrapper[4759]: I1205 00:55:33.175276 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beaae381-be72-48be-8a0e-eba21df779b7" path="/var/lib/kubelet/pods/beaae381-be72-48be-8a0e-eba21df779b7/volumes" Dec 05 00:55:34 crc kubenswrapper[4759]: I1205 00:55:34.433867 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:55:34 crc kubenswrapper[4759]: I1205 00:55:34.434465 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:55:44 crc kubenswrapper[4759]: I1205 00:55:44.611059 4759 generic.go:334] "Generic (PLEG): container finished" podID="95f4137c-2b87-401f-ba79-56befbbd9757" containerID="97c717893f976cc3ff54ad450decc2c75796424effb83e3348c113847f657353" exitCode=0 Dec 05 00:55:44 crc kubenswrapper[4759]: I1205 00:55:44.611798 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" event={"ID":"95f4137c-2b87-401f-ba79-56befbbd9757","Type":"ContainerDied","Data":"97c717893f976cc3ff54ad450decc2c75796424effb83e3348c113847f657353"} Dec 05 00:55:45 crc kubenswrapper[4759]: I1205 00:55:45.051965 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-x67vg"] Dec 05 00:55:45 crc kubenswrapper[4759]: I1205 00:55:45.067953 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-x67vg"] Dec 05 00:55:45 crc kubenswrapper[4759]: I1205 00:55:45.174425 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5d07555-31e9-4b86-b4eb-56b184aef5b7" path="/var/lib/kubelet/pods/e5d07555-31e9-4b86-b4eb-56b184aef5b7/volumes" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.132885 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.208358 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-inventory\") pod \"95f4137c-2b87-401f-ba79-56befbbd9757\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.208435 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-ssh-key\") pod \"95f4137c-2b87-401f-ba79-56befbbd9757\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.208477 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr94r\" (UniqueName: \"kubernetes.io/projected/95f4137c-2b87-401f-ba79-56befbbd9757-kube-api-access-hr94r\") pod \"95f4137c-2b87-401f-ba79-56befbbd9757\" (UID: \"95f4137c-2b87-401f-ba79-56befbbd9757\") " Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.214176 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f4137c-2b87-401f-ba79-56befbbd9757-kube-api-access-hr94r" (OuterVolumeSpecName: "kube-api-access-hr94r") pod "95f4137c-2b87-401f-ba79-56befbbd9757" (UID: "95f4137c-2b87-401f-ba79-56befbbd9757"). InnerVolumeSpecName "kube-api-access-hr94r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.240287 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-inventory" (OuterVolumeSpecName: "inventory") pod "95f4137c-2b87-401f-ba79-56befbbd9757" (UID: "95f4137c-2b87-401f-ba79-56befbbd9757"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.243987 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "95f4137c-2b87-401f-ba79-56befbbd9757" (UID: "95f4137c-2b87-401f-ba79-56befbbd9757"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.311748 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.311795 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95f4137c-2b87-401f-ba79-56befbbd9757-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.311815 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr94r\" (UniqueName: \"kubernetes.io/projected/95f4137c-2b87-401f-ba79-56befbbd9757-kube-api-access-hr94r\") on node \"crc\" DevicePath \"\"" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.644975 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" event={"ID":"95f4137c-2b87-401f-ba79-56befbbd9757","Type":"ContainerDied","Data":"0eb4409342c6acb70b7b5cb7a10106976c545f8aa713d24ada4c73f366d86586"} Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.645021 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0eb4409342c6acb70b7b5cb7a10106976c545f8aa713d24ada4c73f366d86586" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.645070 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.758087 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb"] Dec 05 00:55:46 crc kubenswrapper[4759]: E1205 00:55:46.758809 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f4137c-2b87-401f-ba79-56befbbd9757" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.758827 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f4137c-2b87-401f-ba79-56befbbd9757" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.759085 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f4137c-2b87-401f-ba79-56befbbd9757" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.760137 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.768218 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb"] Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.799409 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.799476 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.799416 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.799793 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.822403 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ktndb\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.822515 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg2nz\" (UniqueName: \"kubernetes.io/projected/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-kube-api-access-fg2nz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ktndb\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.822567 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ktndb\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.924577 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ktndb\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.924771 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg2nz\" (UniqueName: \"kubernetes.io/projected/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-kube-api-access-fg2nz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ktndb\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.924863 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-ktndb\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.930974 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ktndb\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.934058 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ktndb\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:46 crc kubenswrapper[4759]: I1205 00:55:46.952748 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg2nz\" (UniqueName: \"kubernetes.io/projected/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-kube-api-access-fg2nz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ktndb\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:47 crc kubenswrapper[4759]: I1205 00:55:47.114614 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:47 crc kubenswrapper[4759]: I1205 00:55:47.774189 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb"] Dec 05 00:55:47 crc kubenswrapper[4759]: I1205 00:55:47.776285 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 00:55:48 crc kubenswrapper[4759]: I1205 00:55:48.675964 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" event={"ID":"2d4c75d8-1c1c-43d5-b534-34d0b44decf9","Type":"ContainerStarted","Data":"f395de7e32c6dbdc6f6fff9eb361f3cd29adf023fc795ef7718d925ee23e1746"} Dec 05 00:55:48 crc kubenswrapper[4759]: I1205 00:55:48.676580 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" event={"ID":"2d4c75d8-1c1c-43d5-b534-34d0b44decf9","Type":"ContainerStarted","Data":"73ce37d46da65cebf03e468199a3a9de643a7fa867f002a562cc937c457356c2"} Dec 05 00:55:48 crc kubenswrapper[4759]: I1205 00:55:48.694723 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" podStartSLOduration=2.185837362 podStartE2EDuration="2.694704484s" podCreationTimestamp="2025-12-05 00:55:46 +0000 UTC" firstStartedPulling="2025-12-05 00:55:47.775935288 +0000 UTC m=+1966.991596238" lastFinishedPulling="2025-12-05 00:55:48.28480239 +0000 UTC m=+1967.500463360" observedRunningTime="2025-12-05 00:55:48.689206369 +0000 UTC m=+1967.904867319" watchObservedRunningTime="2025-12-05 00:55:48.694704484 +0000 UTC m=+1967.910365434" Dec 05 00:55:53 crc kubenswrapper[4759]: I1205 00:55:53.731729 4759 generic.go:334] "Generic (PLEG): container finished" 
podID="2d4c75d8-1c1c-43d5-b534-34d0b44decf9" containerID="f395de7e32c6dbdc6f6fff9eb361f3cd29adf023fc795ef7718d925ee23e1746" exitCode=0 Dec 05 00:55:53 crc kubenswrapper[4759]: I1205 00:55:53.731823 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" event={"ID":"2d4c75d8-1c1c-43d5-b534-34d0b44decf9","Type":"ContainerDied","Data":"f395de7e32c6dbdc6f6fff9eb361f3cd29adf023fc795ef7718d925ee23e1746"} Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.279563 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.410200 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-inventory\") pod \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.410313 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg2nz\" (UniqueName: \"kubernetes.io/projected/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-kube-api-access-fg2nz\") pod \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.410378 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-ssh-key\") pod \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\" (UID: \"2d4c75d8-1c1c-43d5-b534-34d0b44decf9\") " Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.416070 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-kube-api-access-fg2nz" (OuterVolumeSpecName: "kube-api-access-fg2nz") pod "2d4c75d8-1c1c-43d5-b534-34d0b44decf9" (UID: "2d4c75d8-1c1c-43d5-b534-34d0b44decf9"). InnerVolumeSpecName "kube-api-access-fg2nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.452880 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2d4c75d8-1c1c-43d5-b534-34d0b44decf9" (UID: "2d4c75d8-1c1c-43d5-b534-34d0b44decf9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.470510 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-inventory" (OuterVolumeSpecName: "inventory") pod "2d4c75d8-1c1c-43d5-b534-34d0b44decf9" (UID: "2d4c75d8-1c1c-43d5-b534-34d0b44decf9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.512420 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg2nz\" (UniqueName: \"kubernetes.io/projected/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-kube-api-access-fg2nz\") on node \"crc\" DevicePath \"\"" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.512458 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.512474 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d4c75d8-1c1c-43d5-b534-34d0b44decf9-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.756447 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" event={"ID":"2d4c75d8-1c1c-43d5-b534-34d0b44decf9","Type":"ContainerDied","Data":"73ce37d46da65cebf03e468199a3a9de643a7fa867f002a562cc937c457356c2"} Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.756503 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.756513 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ce37d46da65cebf03e468199a3a9de643a7fa867f002a562cc937c457356c2" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.887493 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l"] Dec 05 00:55:55 crc kubenswrapper[4759]: E1205 00:55:55.888256 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d4c75d8-1c1c-43d5-b534-34d0b44decf9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.888328 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d4c75d8-1c1c-43d5-b534-34d0b44decf9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.888779 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d4c75d8-1c1c-43d5-b534-34d0b44decf9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.889738 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.895763 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.896148 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.896500 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.896678 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 00:55:55 crc kubenswrapper[4759]: I1205 00:55:55.915538 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l"] Dec 05 00:55:56 crc kubenswrapper[4759]: I1205 00:55:56.021510 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6phq9\" (UniqueName: \"kubernetes.io/projected/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-kube-api-access-6phq9\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68b7l\" (UID: \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:55:56 crc kubenswrapper[4759]: I1205 00:55:56.021607 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68b7l\" (UID: \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:55:56 crc kubenswrapper[4759]: I1205 00:55:56.021646 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68b7l\" (UID: \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:55:56 crc kubenswrapper[4759]: I1205 00:55:56.123671 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68b7l\" (UID: \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:55:56 crc kubenswrapper[4759]: I1205 00:55:56.123822 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6phq9\" (UniqueName: \"kubernetes.io/projected/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-kube-api-access-6phq9\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68b7l\" (UID: \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:55:56 crc kubenswrapper[4759]: I1205 00:55:56.123887 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68b7l\" (UID: 
\"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:55:56 crc kubenswrapper[4759]: I1205 00:55:56.129684 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68b7l\" (UID: \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:55:56 crc kubenswrapper[4759]: I1205 00:55:56.129995 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68b7l\" (UID: \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:55:56 crc kubenswrapper[4759]: I1205 00:55:56.139786 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6phq9\" (UniqueName: \"kubernetes.io/projected/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-kube-api-access-6phq9\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68b7l\" (UID: \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:55:56 crc kubenswrapper[4759]: I1205 00:55:56.214184 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:55:56 crc kubenswrapper[4759]: I1205 00:55:56.801740 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l"] Dec 05 00:55:57 crc kubenswrapper[4759]: I1205 00:55:57.781978 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" event={"ID":"d2d871c1-ff7b-408a-85a2-3fe04612bdd4","Type":"ContainerStarted","Data":"f8c4ef36814f43e4d7667fadf427f44517277f93c8611dc722c7e7b298f6ad7f"} Dec 05 00:55:57 crc kubenswrapper[4759]: I1205 00:55:57.782737 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" event={"ID":"d2d871c1-ff7b-408a-85a2-3fe04612bdd4","Type":"ContainerStarted","Data":"1dd06c138acf6943a5042f0fdcf729235c7dbb9184fe36afb8db4e13d0156ece"} Dec 05 00:55:57 crc kubenswrapper[4759]: I1205 00:55:57.815652 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" podStartSLOduration=2.396098683 podStartE2EDuration="2.815625742s" podCreationTimestamp="2025-12-05 00:55:55 +0000 UTC" firstStartedPulling="2025-12-05 00:55:56.809083649 +0000 UTC m=+1976.024744589" lastFinishedPulling="2025-12-05 00:55:57.228610688 +0000 UTC m=+1976.444271648" observedRunningTime="2025-12-05 00:55:57.803591848 +0000 UTC m=+1977.019252838" watchObservedRunningTime="2025-12-05 00:55:57.815625742 +0000 UTC m=+1977.031286732" Dec 05 00:56:04 crc kubenswrapper[4759]: I1205 00:56:04.433122 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:56:04 crc kubenswrapper[4759]: I1205 00:56:04.434005 4759 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:56:16 crc kubenswrapper[4759]: I1205 00:56:16.046658 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hl5mz"] Dec 05 00:56:16 crc kubenswrapper[4759]: I1205 00:56:16.057384 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hl5mz"] Dec 05 00:56:17 crc kubenswrapper[4759]: I1205 00:56:17.032578 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-dx9np"] Dec 05 00:56:17 crc kubenswrapper[4759]: I1205 00:56:17.040790 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-dx9np"] Dec 05 00:56:17 crc kubenswrapper[4759]: I1205 00:56:17.174513 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="538f07d4-2a1a-47e9-aec3-161f7b23af6f" path="/var/lib/kubelet/pods/538f07d4-2a1a-47e9-aec3-161f7b23af6f/volumes" Dec 05 00:56:17 crc kubenswrapper[4759]: I1205 00:56:17.176596 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="885cf08d-63c8-45da-ab2a-18b28a9b0f40" path="/var/lib/kubelet/pods/885cf08d-63c8-45da-ab2a-18b28a9b0f40/volumes" Dec 05 00:56:27 crc kubenswrapper[4759]: I1205 00:56:27.063623 4759 scope.go:117] "RemoveContainer" containerID="c2374cb010257bf2b734839ed8de394e9a0ea247c389ee1a58a00c206bf27279" Dec 05 00:56:27 crc kubenswrapper[4759]: I1205 00:56:27.109153 4759 scope.go:117] "RemoveContainer" containerID="9de5ddb54008cf35e5d8f0a9131ea80a4c64aa12bf3aae314459c4f332dd5621" Dec 05 00:56:27 crc kubenswrapper[4759]: I1205 00:56:27.171178 4759 scope.go:117] "RemoveContainer" containerID="5147e1c6fcf02724f81e0b312ec97933269738613fe307280363a4e04a8e31d1" Dec 05 00:56:27 crc kubenswrapper[4759]: I1205 00:56:27.240542 4759 scope.go:117] "RemoveContainer" containerID="39bc75bdc433c1b8bcf585254b2e7c77f08d3faed7e1304bd7a3336ce72874c6" Dec 05 00:56:27 crc kubenswrapper[4759]: I1205 00:56:27.316604 4759 scope.go:117] "RemoveContainer" containerID="316fd171e5af8ce8e40018636c8b73154191034b940cf911d03b705f255d9fe0" Dec 05 00:56:27 crc kubenswrapper[4759]: I1205 00:56:27.366073 4759 scope.go:117] "RemoveContainer" containerID="41d11559900b9c3cd45ffc9403c2465b3515377629e01c9ec3a82a728480b75e" Dec 05 00:56:27 crc kubenswrapper[4759]: I1205 00:56:27.439597 4759 scope.go:117] "RemoveContainer" containerID="e2e8556c5516f39afbf9a226c20463fba9ebafa9b825f57252bd865810f4efc4" Dec 05 00:56:27 crc kubenswrapper[4759]: I1205 00:56:27.466209 4759 scope.go:117] "RemoveContainer" containerID="7c9059d078f95d08702fa82d53350199df4cc766991c8d164b379b9c60934964" Dec 05 00:56:27 crc kubenswrapper[4759]: I1205 00:56:27.488798 4759 scope.go:117] "RemoveContainer" containerID="54384dd13cc9b46927d505d59e71f259dcb3239c9d1081840a9aba7e9c8ef0c2" Dec 05 00:56:27 crc kubenswrapper[4759]: I1205 00:56:27.541156 4759 scope.go:117] "RemoveContainer" containerID="666293e33e341c27b9434994b054f34fc0bbf6d90907a912b42ae4b2c2086123" Dec 05 00:56:27 crc kubenswrapper[4759]: I1205 00:56:27.582730 4759 scope.go:117] "RemoveContainer" containerID="01915c63b208afe0d18292a0df305d5656b0506de0829b0db9e0fee6823d9a82" Dec 05 00:56:32 crc kubenswrapper[4759]: I1205 00:56:32.051193 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-sync-s5c5z"] Dec 05 00:56:32 crc kubenswrapper[4759]: I1205 00:56:32.061889 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-s5c5z"] Dec 05 00:56:33 crc kubenswrapper[4759]: I1205 00:56:33.050426 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-vvtw7"] Dec 05 00:56:33 crc kubenswrapper[4759]: I1205 00:56:33.068412 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hkbq2"] Dec 05 00:56:33 crc kubenswrapper[4759]: I1205 00:56:33.080740 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hkbq2"] Dec 05 00:56:33 crc kubenswrapper[4759]: I1205 00:56:33.092664 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-vvtw7"] Dec 05 00:56:33 crc kubenswrapper[4759]: I1205 00:56:33.170730 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="419ed25e-6ca1-4ca7-978e-7b1464982278" path="/var/lib/kubelet/pods/419ed25e-6ca1-4ca7-978e-7b1464982278/volumes" Dec 05 00:56:33 crc kubenswrapper[4759]: I1205 00:56:33.171633 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bcbbf13-82ed-4c4a-8694-d3149d730cb0" path="/var/lib/kubelet/pods/8bcbbf13-82ed-4c4a-8694-d3149d730cb0/volumes" Dec 05 00:56:33 crc kubenswrapper[4759]: I1205 00:56:33.172383 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e5f8952-6ad7-472f-b295-b426f1404270" path="/var/lib/kubelet/pods/9e5f8952-6ad7-472f-b295-b426f1404270/volumes" Dec 05 00:56:34 crc kubenswrapper[4759]: I1205 00:56:34.433007 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:56:34 crc kubenswrapper[4759]: I1205 00:56:34.433395 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:56:34 crc kubenswrapper[4759]: I1205 00:56:34.433444 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 00:56:34 crc kubenswrapper[4759]: I1205 00:56:34.434145 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b621a27b3e273f787ffd52c76f267d74b63dedbd64dc94442d2b4dbf1f2c61f8"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 00:56:34 crc kubenswrapper[4759]: I1205 00:56:34.434217 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://b621a27b3e273f787ffd52c76f267d74b63dedbd64dc94442d2b4dbf1f2c61f8" gracePeriod=600 Dec 05 00:56:35 crc kubenswrapper[4759]: I1205 00:56:35.525135 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" 
containerID="b621a27b3e273f787ffd52c76f267d74b63dedbd64dc94442d2b4dbf1f2c61f8" exitCode=0 Dec 05 00:56:35 crc kubenswrapper[4759]: I1205 00:56:35.525183 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"b621a27b3e273f787ffd52c76f267d74b63dedbd64dc94442d2b4dbf1f2c61f8"} Dec 05 00:56:35 crc kubenswrapper[4759]: I1205 00:56:35.525625 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03"} Dec 05 00:56:35 crc kubenswrapper[4759]: I1205 00:56:35.525650 4759 scope.go:117] "RemoveContainer" containerID="543d5cb3545835b76883bef6633b1537928460068abf65c68762ce4345c6c1d1" Dec 05 00:56:39 crc kubenswrapper[4759]: I1205 00:56:39.575002 4759 generic.go:334] "Generic (PLEG): container finished" podID="d2d871c1-ff7b-408a-85a2-3fe04612bdd4" containerID="f8c4ef36814f43e4d7667fadf427f44517277f93c8611dc722c7e7b298f6ad7f" exitCode=0 Dec 05 00:56:39 crc kubenswrapper[4759]: I1205 00:56:39.575107 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" event={"ID":"d2d871c1-ff7b-408a-85a2-3fe04612bdd4","Type":"ContainerDied","Data":"f8c4ef36814f43e4d7667fadf427f44517277f93c8611dc722c7e7b298f6ad7f"} Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.178433 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.214004 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6phq9\" (UniqueName: \"kubernetes.io/projected/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-kube-api-access-6phq9\") pod \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\" (UID: \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.214090 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-ssh-key\") pod \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\" (UID: \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.214390 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-inventory\") pod \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\" (UID: \"d2d871c1-ff7b-408a-85a2-3fe04612bdd4\") " Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.244229 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-kube-api-access-6phq9" (OuterVolumeSpecName: "kube-api-access-6phq9") pod "d2d871c1-ff7b-408a-85a2-3fe04612bdd4" (UID: "d2d871c1-ff7b-408a-85a2-3fe04612bdd4"). InnerVolumeSpecName "kube-api-access-6phq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.280157 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-inventory" (OuterVolumeSpecName: "inventory") pod "d2d871c1-ff7b-408a-85a2-3fe04612bdd4" (UID: "d2d871c1-ff7b-408a-85a2-3fe04612bdd4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.290502 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d2d871c1-ff7b-408a-85a2-3fe04612bdd4" (UID: "d2d871c1-ff7b-408a-85a2-3fe04612bdd4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.318770 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6phq9\" (UniqueName: \"kubernetes.io/projected/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-kube-api-access-6phq9\") on node \"crc\" DevicePath \"\"" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.318799 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.318809 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2d871c1-ff7b-408a-85a2-3fe04612bdd4-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.606159 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.606171 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l" event={"ID":"d2d871c1-ff7b-408a-85a2-3fe04612bdd4","Type":"ContainerDied","Data":"1dd06c138acf6943a5042f0fdcf729235c7dbb9184fe36afb8db4e13d0156ece"} Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.607337 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dd06c138acf6943a5042f0fdcf729235c7dbb9184fe36afb8db4e13d0156ece" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.737843 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh"] Dec 05 00:56:41 crc kubenswrapper[4759]: E1205 00:56:41.739098 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d871c1-ff7b-408a-85a2-3fe04612bdd4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.739122 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d871c1-ff7b-408a-85a2-3fe04612bdd4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.739399 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d871c1-ff7b-408a-85a2-3fe04612bdd4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.740348 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.748135 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.748227 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.749545 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.749870 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.755014 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh"] Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.829899 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh\" (UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.830036 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwbjf\" (UniqueName: \"kubernetes.io/projected/5fdd474f-9093-4247-87af-9731a451fc7f-kube-api-access-kwbjf\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh\" (UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.830069 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh\" (UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.932052 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwbjf\" (UniqueName: \"kubernetes.io/projected/5fdd474f-9093-4247-87af-9731a451fc7f-kube-api-access-kwbjf\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh\" (UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.932130 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh\" (UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.933246 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh\" 
(UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.942213 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh\" (UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.942294 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh\" (UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:41 crc kubenswrapper[4759]: I1205 00:56:41.952667 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwbjf\" (UniqueName: \"kubernetes.io/projected/5fdd474f-9093-4247-87af-9731a451fc7f-kube-api-access-kwbjf\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh\" (UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:42 crc kubenswrapper[4759]: I1205 00:56:42.056480 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:42 crc kubenswrapper[4759]: I1205 00:56:42.793784 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh"] Dec 05 00:56:43 crc kubenswrapper[4759]: I1205 00:56:43.636381 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" event={"ID":"5fdd474f-9093-4247-87af-9731a451fc7f","Type":"ContainerStarted","Data":"d0b1a1d0291c0cfff6e45d51a776ac2278f659e2aa0bc01a05a5ca5a762850c4"} Dec 05 00:56:43 crc kubenswrapper[4759]: I1205 00:56:43.637122 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" event={"ID":"5fdd474f-9093-4247-87af-9731a451fc7f","Type":"ContainerStarted","Data":"37a0580182cacfbd2492266644871f2ed3b8b66ce8308683696f14c03975d97b"} Dec 05 00:56:43 crc kubenswrapper[4759]: I1205 00:56:43.670410 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" podStartSLOduration=2.246888714 podStartE2EDuration="2.670390121s" podCreationTimestamp="2025-12-05 00:56:41 +0000 UTC" firstStartedPulling="2025-12-05 00:56:42.781100154 +0000 UTC m=+2021.996761104" lastFinishedPulling="2025-12-05 00:56:43.204601571 +0000 UTC m=+2022.420262511" observedRunningTime="2025-12-05 00:56:43.656585003 +0000 UTC m=+2022.872245963" watchObservedRunningTime="2025-12-05 00:56:43.670390121 +0000 UTC m=+2022.886051071" Dec 05 00:56:44 crc kubenswrapper[4759]: I1205 00:56:44.030886 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-wdw54"] Dec 05 00:56:44 crc kubenswrapper[4759]: I1205 00:56:44.042688 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-wdw54"] Dec 05 00:56:45 crc kubenswrapper[4759]: I1205 00:56:45.174714 4759 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="8fe2c3db-f452-4009-abca-b9ee975ad38d" path="/var/lib/kubelet/pods/8fe2c3db-f452-4009-abca-b9ee975ad38d/volumes" Dec 05 00:56:48 crc kubenswrapper[4759]: I1205 00:56:48.708670 4759 generic.go:334] "Generic (PLEG): container finished" podID="5fdd474f-9093-4247-87af-9731a451fc7f" containerID="d0b1a1d0291c0cfff6e45d51a776ac2278f659e2aa0bc01a05a5ca5a762850c4" exitCode=0 Dec 05 00:56:48 crc kubenswrapper[4759]: I1205 00:56:48.708715 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" event={"ID":"5fdd474f-9093-4247-87af-9731a451fc7f","Type":"ContainerDied","Data":"d0b1a1d0291c0cfff6e45d51a776ac2278f659e2aa0bc01a05a5ca5a762850c4"} Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.161297 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.329470 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-inventory\") pod \"5fdd474f-9093-4247-87af-9731a451fc7f\" (UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.329965 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwbjf\" (UniqueName: \"kubernetes.io/projected/5fdd474f-9093-4247-87af-9731a451fc7f-kube-api-access-kwbjf\") pod \"5fdd474f-9093-4247-87af-9731a451fc7f\" (UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.330034 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-ssh-key\") pod \"5fdd474f-9093-4247-87af-9731a451fc7f\" (UID: \"5fdd474f-9093-4247-87af-9731a451fc7f\") " Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.338044 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fdd474f-9093-4247-87af-9731a451fc7f-kube-api-access-kwbjf" (OuterVolumeSpecName: "kube-api-access-kwbjf") pod "5fdd474f-9093-4247-87af-9731a451fc7f" (UID: "5fdd474f-9093-4247-87af-9731a451fc7f"). InnerVolumeSpecName "kube-api-access-kwbjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.402577 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5fdd474f-9093-4247-87af-9731a451fc7f" (UID: "5fdd474f-9093-4247-87af-9731a451fc7f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.409207 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-inventory" (OuterVolumeSpecName: "inventory") pod "5fdd474f-9093-4247-87af-9731a451fc7f" (UID: "5fdd474f-9093-4247-87af-9731a451fc7f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.432962 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwbjf\" (UniqueName: \"kubernetes.io/projected/5fdd474f-9093-4247-87af-9731a451fc7f-kube-api-access-kwbjf\") on node \"crc\" DevicePath \"\"" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.432994 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.433004 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fdd474f-9093-4247-87af-9731a451fc7f-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.740619 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" event={"ID":"5fdd474f-9093-4247-87af-9731a451fc7f","Type":"ContainerDied","Data":"37a0580182cacfbd2492266644871f2ed3b8b66ce8308683696f14c03975d97b"} Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.740686 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37a0580182cacfbd2492266644871f2ed3b8b66ce8308683696f14c03975d97b" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.740855 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.825871 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls"] Dec 05 00:56:50 crc kubenswrapper[4759]: E1205 00:56:50.826460 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fdd474f-9093-4247-87af-9731a451fc7f" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.826484 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fdd474f-9093-4247-87af-9731a451fc7f" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.826746 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fdd474f-9093-4247-87af-9731a451fc7f" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.827675 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.843700 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.843752 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.843765 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.843969 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.845699 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls"] Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.947519 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6srn\" (UniqueName: \"kubernetes.io/projected/80a82f73-12c2-4a77-9bc4-500b26cacfa5-kube-api-access-q6srn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wbvls\" (UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.947711 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wbvls\" (UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:56:50 crc kubenswrapper[4759]: I1205 00:56:50.947743 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wbvls\" (UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:56:51 crc kubenswrapper[4759]: I1205 00:56:51.050390 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6srn\" (UniqueName: \"kubernetes.io/projected/80a82f73-12c2-4a77-9bc4-500b26cacfa5-kube-api-access-q6srn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wbvls\" (UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:56:51 crc kubenswrapper[4759]: I1205 00:56:51.051082 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wbvls\" (UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:56:51 crc kubenswrapper[4759]: I1205 00:56:51.051264 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wbvls\" 
(UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:56:51 crc kubenswrapper[4759]: I1205 00:56:51.055712 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wbvls\" (UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:56:51 crc kubenswrapper[4759]: I1205 00:56:51.055789 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wbvls\" (UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:56:51 crc kubenswrapper[4759]: I1205 00:56:51.072133 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6srn\" (UniqueName: \"kubernetes.io/projected/80a82f73-12c2-4a77-9bc4-500b26cacfa5-kube-api-access-q6srn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wbvls\" (UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:56:51 crc kubenswrapper[4759]: I1205 00:56:51.209099 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:56:51 crc kubenswrapper[4759]: I1205 00:56:51.846677 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls"] Dec 05 00:56:52 crc kubenswrapper[4759]: I1205 00:56:52.770143 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" event={"ID":"80a82f73-12c2-4a77-9bc4-500b26cacfa5","Type":"ContainerStarted","Data":"94591c6f025885339c087f7822450f38a666c8f6bab717d6ffdc7105acf396ea"} Dec 05 00:56:53 crc kubenswrapper[4759]: I1205 00:56:53.787053 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" event={"ID":"80a82f73-12c2-4a77-9bc4-500b26cacfa5","Type":"ContainerStarted","Data":"ba38081fb843b12b5683fc7f75b0990eb2badf6a7e9f6e7cb9056e4b3a53f741"} Dec 05 00:56:53 crc kubenswrapper[4759]: I1205 00:56:53.805772 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" podStartSLOduration=2.573639495 podStartE2EDuration="3.805757875s" podCreationTimestamp="2025-12-05 00:56:50 +0000 UTC" firstStartedPulling="2025-12-05 00:56:51.839101504 +0000 UTC m=+2031.054762454" lastFinishedPulling="2025-12-05 00:56:53.071219844 +0000 UTC m=+2032.286880834" observedRunningTime="2025-12-05 00:56:53.802640729 +0000 UTC m=+2033.018301709" watchObservedRunningTime="2025-12-05 00:56:53.805757875 +0000 UTC m=+2033.021418825" Dec 05 00:57:27 crc kubenswrapper[4759]: I1205 00:57:27.838131 4759 scope.go:117] "RemoveContainer" containerID="dc4d9b28b4fb076966c0c353c9b60133f46742a73a6f8997bc979ccb3230050e" Dec 05 00:57:27 crc kubenswrapper[4759]: I1205 00:57:27.901491 4759 scope.go:117] "RemoveContainer" containerID="9c4a66b1bac8326805e90951a7af9a0adb1cf7a0045b18833bd1e9f4dafc6d82" Dec 05 00:57:27 crc kubenswrapper[4759]: I1205 
00:57:27.951386 4759 scope.go:117] "RemoveContainer" containerID="598b58270ecae4c3e21a1f7bb2001bba2b3512ef5d43292a7ed347f7b30bd769" Dec 05 00:57:28 crc kubenswrapper[4759]: I1205 00:57:28.015090 4759 scope.go:117] "RemoveContainer" containerID="f85a35ddba335d95eaf0401b53d370c3576a48693a5629c7b064127025b99058" Dec 05 00:57:32 crc kubenswrapper[4759]: I1205 00:57:32.063385 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hsbmj"] Dec 05 00:57:32 crc kubenswrapper[4759]: I1205 00:57:32.070923 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f9f6-account-create-update-lq98w"] Dec 05 00:57:32 crc kubenswrapper[4759]: I1205 00:57:32.081932 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f9f6-account-create-update-lq98w"] Dec 05 00:57:32 crc kubenswrapper[4759]: I1205 00:57:32.089430 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hsbmj"] Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.063496 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-78a8-account-create-update-wwr6v"] Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.079976 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-78a8-account-create-update-wwr6v"] Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.092406 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-vzjc9"] Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.100207 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-bf22-account-create-update-24szb"] Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.125486 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-h6trw"] Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.128613 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-vzjc9"] Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.143200 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-bf22-account-create-update-24szb"] Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.155056 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-h6trw"] Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.181148 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="018e67eb-28b1-4a1e-8da2-115462fef72a" path="/var/lib/kubelet/pods/018e67eb-28b1-4a1e-8da2-115462fef72a/volumes" Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.182703 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b3ce4d-2b4b-4872-91a5-c353c80d2cb6" path="/var/lib/kubelet/pods/12b3ce4d-2b4b-4872-91a5-c353c80d2cb6/volumes" Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.184108 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="710013b3-7fcc-4c39-a383-7361318ae0b6" path="/var/lib/kubelet/pods/710013b3-7fcc-4c39-a383-7361318ae0b6/volumes" Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.185386 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e822d6-72f5-4222-8a41-4cd64c090c13" path="/var/lib/kubelet/pods/b5e822d6-72f5-4222-8a41-4cd64c090c13/volumes" Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.187381 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebfe7fd9-c621-4fc3-a3ce-cba2143af712" 
path="/var/lib/kubelet/pods/ebfe7fd9-c621-4fc3-a3ce-cba2143af712/volumes" Dec 05 00:57:33 crc kubenswrapper[4759]: I1205 00:57:33.188015 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa8943e8-c599-484c-a821-c983033ed94a" path="/var/lib/kubelet/pods/fa8943e8-c599-484c-a821-c983033ed94a/volumes" Dec 05 00:57:52 crc kubenswrapper[4759]: I1205 00:57:52.448424 4759 generic.go:334] "Generic (PLEG): container finished" podID="80a82f73-12c2-4a77-9bc4-500b26cacfa5" containerID="ba38081fb843b12b5683fc7f75b0990eb2badf6a7e9f6e7cb9056e4b3a53f741" exitCode=0 Dec 05 00:57:52 crc kubenswrapper[4759]: I1205 00:57:52.448557 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" event={"ID":"80a82f73-12c2-4a77-9bc4-500b26cacfa5","Type":"ContainerDied","Data":"ba38081fb843b12b5683fc7f75b0990eb2badf6a7e9f6e7cb9056e4b3a53f741"} Dec 05 00:57:53 crc kubenswrapper[4759]: I1205 00:57:53.988875 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.011989 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6srn\" (UniqueName: \"kubernetes.io/projected/80a82f73-12c2-4a77-9bc4-500b26cacfa5-kube-api-access-q6srn\") pod \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\" (UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.012038 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-ssh-key\") pod \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\" (UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.012062 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-inventory\") pod \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\" (UID: \"80a82f73-12c2-4a77-9bc4-500b26cacfa5\") " Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.020930 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a82f73-12c2-4a77-9bc4-500b26cacfa5-kube-api-access-q6srn" (OuterVolumeSpecName: "kube-api-access-q6srn") pod "80a82f73-12c2-4a77-9bc4-500b26cacfa5" (UID: "80a82f73-12c2-4a77-9bc4-500b26cacfa5"). InnerVolumeSpecName "kube-api-access-q6srn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.061116 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-inventory" (OuterVolumeSpecName: "inventory") pod "80a82f73-12c2-4a77-9bc4-500b26cacfa5" (UID: "80a82f73-12c2-4a77-9bc4-500b26cacfa5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.071555 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "80a82f73-12c2-4a77-9bc4-500b26cacfa5" (UID: "80a82f73-12c2-4a77-9bc4-500b26cacfa5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.119447 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6srn\" (UniqueName: \"kubernetes.io/projected/80a82f73-12c2-4a77-9bc4-500b26cacfa5-kube-api-access-q6srn\") on node \"crc\" DevicePath \"\"" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.119482 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.119498 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80a82f73-12c2-4a77-9bc4-500b26cacfa5-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.480301 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" event={"ID":"80a82f73-12c2-4a77-9bc4-500b26cacfa5","Type":"ContainerDied","Data":"94591c6f025885339c087f7822450f38a666c8f6bab717d6ffdc7105acf396ea"} Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.480674 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.480707 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94591c6f025885339c087f7822450f38a666c8f6bab717d6ffdc7105acf396ea" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.600568 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-92m6k"] Dec 05 00:57:54 crc kubenswrapper[4759]: E1205 00:57:54.601102 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a82f73-12c2-4a77-9bc4-500b26cacfa5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.601126 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a82f73-12c2-4a77-9bc4-500b26cacfa5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.601558 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a82f73-12c2-4a77-9bc4-500b26cacfa5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.602738 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.612938 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.616156 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.616221 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.616482 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.635245 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-92m6k\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.635398 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-92m6k\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.635554 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gv9l\" (UniqueName: \"kubernetes.io/projected/dfd566ee-35b4-4a14-9683-3b93f9fb272e-kube-api-access-8gv9l\") pod \"ssh-known-hosts-edpm-deployment-92m6k\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.638045 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-92m6k"] Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.736462 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-92m6k\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.736613 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gv9l\" (UniqueName: \"kubernetes.io/projected/dfd566ee-35b4-4a14-9683-3b93f9fb272e-kube-api-access-8gv9l\") pod \"ssh-known-hosts-edpm-deployment-92m6k\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.736641 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-92m6k\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:57:54 crc 
kubenswrapper[4759]: I1205 00:57:54.741485 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-92m6k\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.742460 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-92m6k\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.752522 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gv9l\" (UniqueName: \"kubernetes.io/projected/dfd566ee-35b4-4a14-9683-3b93f9fb272e-kube-api-access-8gv9l\") pod \"ssh-known-hosts-edpm-deployment-92m6k\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:57:54 crc kubenswrapper[4759]: I1205 00:57:54.950185 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:57:55 crc kubenswrapper[4759]: I1205 00:57:55.563907 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-92m6k"] Dec 05 00:57:56 crc kubenswrapper[4759]: I1205 00:57:56.499490 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" event={"ID":"dfd566ee-35b4-4a14-9683-3b93f9fb272e","Type":"ContainerStarted","Data":"461a52992e724a50afed6f0b61ea7c4df1986850883c035e658dd0ea4e085ced"} Dec 05 00:57:57 crc kubenswrapper[4759]: I1205 00:57:57.519571 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" event={"ID":"dfd566ee-35b4-4a14-9683-3b93f9fb272e","Type":"ContainerStarted","Data":"027a50605e6c7248a4b5999a339ea6c3e6c34d59c18643c8d709e752ba0cddb4"} Dec 05 00:57:57 crc kubenswrapper[4759]: I1205 00:57:57.544276 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" podStartSLOduration=2.961778801 podStartE2EDuration="3.544239744s" podCreationTimestamp="2025-12-05 00:57:54 +0000 UTC" firstStartedPulling="2025-12-05 00:57:55.563265712 +0000 UTC m=+2094.778926672" lastFinishedPulling="2025-12-05 00:57:56.145726665 +0000 UTC m=+2095.361387615" observedRunningTime="2025-12-05 00:57:57.542488432 +0000 UTC m=+2096.758149392" watchObservedRunningTime="2025-12-05 00:57:57.544239744 +0000 UTC m=+2096.759900714" Dec 05 00:58:01 crc kubenswrapper[4759]: I1205 00:58:01.094945 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l4fmq"] Dec 05 00:58:01 crc kubenswrapper[4759]: I1205 00:58:01.108361 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l4fmq"] Dec 05 00:58:01 crc kubenswrapper[4759]: I1205 00:58:01.169083 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee78c1b8-dbc7-496b-b757-4433f2764e67" path="/var/lib/kubelet/pods/ee78c1b8-dbc7-496b-b757-4433f2764e67/volumes" Dec 05 00:58:04 crc kubenswrapper[4759]: I1205 00:58:04.596971 4759 generic.go:334] "Generic (PLEG): container 
finished" podID="dfd566ee-35b4-4a14-9683-3b93f9fb272e" containerID="027a50605e6c7248a4b5999a339ea6c3e6c34d59c18643c8d709e752ba0cddb4" exitCode=0 Dec 05 00:58:04 crc kubenswrapper[4759]: I1205 00:58:04.597092 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" event={"ID":"dfd566ee-35b4-4a14-9683-3b93f9fb272e","Type":"ContainerDied","Data":"027a50605e6c7248a4b5999a339ea6c3e6c34d59c18643c8d709e752ba0cddb4"} Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.268037 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.388202 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-inventory-0\") pod \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.388257 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-ssh-key-openstack-edpm-ipam\") pod \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.388363 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gv9l\" (UniqueName: \"kubernetes.io/projected/dfd566ee-35b4-4a14-9683-3b93f9fb272e-kube-api-access-8gv9l\") pod \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\" (UID: \"dfd566ee-35b4-4a14-9683-3b93f9fb272e\") " Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.393644 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd566ee-35b4-4a14-9683-3b93f9fb272e-kube-api-access-8gv9l" (OuterVolumeSpecName: "kube-api-access-8gv9l") pod "dfd566ee-35b4-4a14-9683-3b93f9fb272e" (UID: "dfd566ee-35b4-4a14-9683-3b93f9fb272e"). InnerVolumeSpecName "kube-api-access-8gv9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.422014 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dfd566ee-35b4-4a14-9683-3b93f9fb272e" (UID: "dfd566ee-35b4-4a14-9683-3b93f9fb272e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.430191 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "dfd566ee-35b4-4a14-9683-3b93f9fb272e" (UID: "dfd566ee-35b4-4a14-9683-3b93f9fb272e"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.491219 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gv9l\" (UniqueName: \"kubernetes.io/projected/dfd566ee-35b4-4a14-9683-3b93f9fb272e-kube-api-access-8gv9l\") on node \"crc\" DevicePath \"\"" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.491254 4759 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.491266 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dfd566ee-35b4-4a14-9683-3b93f9fb272e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.621382 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" event={"ID":"dfd566ee-35b4-4a14-9683-3b93f9fb272e","Type":"ContainerDied","Data":"461a52992e724a50afed6f0b61ea7c4df1986850883c035e658dd0ea4e085ced"} Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.621438 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="461a52992e724a50afed6f0b61ea7c4df1986850883c035e658dd0ea4e085ced" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.621441 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-92m6k" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.712260 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks"] Dec 05 00:58:06 crc kubenswrapper[4759]: E1205 00:58:06.713275 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd566ee-35b4-4a14-9683-3b93f9fb272e" containerName="ssh-known-hosts-edpm-deployment" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.713297 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd566ee-35b4-4a14-9683-3b93f9fb272e" containerName="ssh-known-hosts-edpm-deployment" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.713582 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd566ee-35b4-4a14-9683-3b93f9fb272e" containerName="ssh-known-hosts-edpm-deployment" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.714229 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.716714 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.716958 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.721075 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.721550 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.728652 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks"] Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.797951 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2thks\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.798263 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkndd\" (UniqueName: \"kubernetes.io/projected/c0c1f555-7a29-4b6c-8047-46941df58dca-kube-api-access-wkndd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2thks\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.798437 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2thks\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.900260 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkndd\" (UniqueName: \"kubernetes.io/projected/c0c1f555-7a29-4b6c-8047-46941df58dca-kube-api-access-wkndd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2thks\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.900394 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2thks\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.900475 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2thks\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.904788 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2thks\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.905813 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2thks\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:06 crc kubenswrapper[4759]: I1205 00:58:06.915845 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkndd\" (UniqueName: \"kubernetes.io/projected/c0c1f555-7a29-4b6c-8047-46941df58dca-kube-api-access-wkndd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2thks\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:07 crc kubenswrapper[4759]: I1205 00:58:07.040037 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:08 crc kubenswrapper[4759]: I1205 00:58:08.944268 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks"] Dec 05 00:58:08 crc kubenswrapper[4759]: W1205 00:58:08.946266 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0c1f555_7a29_4b6c_8047_46941df58dca.slice/crio-66652732898ff1f9a349e7572f8d6613f81a92c9b12d9569b05fc6bcceb8bd70 WatchSource:0}: Error finding container 66652732898ff1f9a349e7572f8d6613f81a92c9b12d9569b05fc6bcceb8bd70: Status 404 returned error can't find the container with id 66652732898ff1f9a349e7572f8d6613f81a92c9b12d9569b05fc6bcceb8bd70 Dec 05 00:58:09 crc kubenswrapper[4759]: I1205 00:58:09.659500 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" event={"ID":"c0c1f555-7a29-4b6c-8047-46941df58dca","Type":"ContainerStarted","Data":"6574019cbcabd3499e8d1afaba101b64b04a81a6f8bd4ef12447cd3f528eba82"} Dec 05 00:58:09 crc kubenswrapper[4759]: I1205 00:58:09.659890 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" event={"ID":"c0c1f555-7a29-4b6c-8047-46941df58dca","Type":"ContainerStarted","Data":"66652732898ff1f9a349e7572f8d6613f81a92c9b12d9569b05fc6bcceb8bd70"} Dec 05 00:58:09 crc kubenswrapper[4759]: I1205 00:58:09.678261 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" podStartSLOduration=3.231993902 podStartE2EDuration="3.678233323s" podCreationTimestamp="2025-12-05 00:58:06 +0000 UTC" firstStartedPulling="2025-12-05 00:58:08.949040262 +0000 UTC m=+2108.164701222" lastFinishedPulling="2025-12-05 00:58:09.395279683 +0000 UTC m=+2108.610940643" observedRunningTime="2025-12-05 00:58:09.675693161 +0000 UTC m=+2108.891354161" watchObservedRunningTime="2025-12-05 00:58:09.678233323 +0000 UTC 
m=+2108.893894313" Dec 05 00:58:13 crc kubenswrapper[4759]: I1205 00:58:13.043181 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-t4ld2"] Dec 05 00:58:13 crc kubenswrapper[4759]: I1205 00:58:13.075764 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-6d43-account-create-update-78sjq"] Dec 05 00:58:13 crc kubenswrapper[4759]: I1205 00:58:13.086968 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-6d43-account-create-update-78sjq"] Dec 05 00:58:13 crc kubenswrapper[4759]: I1205 00:58:13.099426 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-t4ld2"] Dec 05 00:58:13 crc kubenswrapper[4759]: I1205 00:58:13.174531 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26a46bc1-98c5-4362-93fe-9a2c140cb04d" path="/var/lib/kubelet/pods/26a46bc1-98c5-4362-93fe-9a2c140cb04d/volumes" Dec 05 00:58:13 crc kubenswrapper[4759]: I1205 00:58:13.175131 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26b2620d-e545-46ea-b97e-3fb934b5053c" path="/var/lib/kubelet/pods/26b2620d-e545-46ea-b97e-3fb934b5053c/volumes" Dec 05 00:58:19 crc kubenswrapper[4759]: I1205 00:58:19.046976 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-bsbqv"] Dec 05 00:58:19 crc kubenswrapper[4759]: I1205 00:58:19.059867 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-bsbqv"] Dec 05 00:58:19 crc kubenswrapper[4759]: I1205 00:58:19.180508 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15826122-66ee-4470-a806-6328f0fbb58c" path="/var/lib/kubelet/pods/15826122-66ee-4470-a806-6328f0fbb58c/volumes" Dec 05 00:58:19 crc kubenswrapper[4759]: I1205 00:58:19.807939 4759 generic.go:334] "Generic (PLEG): container finished" podID="c0c1f555-7a29-4b6c-8047-46941df58dca" containerID="6574019cbcabd3499e8d1afaba101b64b04a81a6f8bd4ef12447cd3f528eba82" exitCode=0 Dec 05 00:58:19 crc kubenswrapper[4759]: I1205 00:58:19.808002 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" event={"ID":"c0c1f555-7a29-4b6c-8047-46941df58dca","Type":"ContainerDied","Data":"6574019cbcabd3499e8d1afaba101b64b04a81a6f8bd4ef12447cd3f528eba82"} Dec 05 00:58:20 crc kubenswrapper[4759]: I1205 00:58:20.063342 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pntsp"] Dec 05 00:58:20 crc kubenswrapper[4759]: I1205 00:58:20.074025 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pntsp"] Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.177503 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2f98436-ab15-4fb1-91d5-2b1ae63f25ef" path="/var/lib/kubelet/pods/d2f98436-ab15-4fb1-91d5-2b1ae63f25ef/volumes" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.350540 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.499661 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkndd\" (UniqueName: \"kubernetes.io/projected/c0c1f555-7a29-4b6c-8047-46941df58dca-kube-api-access-wkndd\") pod \"c0c1f555-7a29-4b6c-8047-46941df58dca\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.499925 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-ssh-key\") pod \"c0c1f555-7a29-4b6c-8047-46941df58dca\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.500010 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-inventory\") pod \"c0c1f555-7a29-4b6c-8047-46941df58dca\" (UID: \"c0c1f555-7a29-4b6c-8047-46941df58dca\") " Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.509613 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c1f555-7a29-4b6c-8047-46941df58dca-kube-api-access-wkndd" (OuterVolumeSpecName: "kube-api-access-wkndd") pod "c0c1f555-7a29-4b6c-8047-46941df58dca" (UID: "c0c1f555-7a29-4b6c-8047-46941df58dca"). InnerVolumeSpecName "kube-api-access-wkndd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.537688 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c0c1f555-7a29-4b6c-8047-46941df58dca" (UID: "c0c1f555-7a29-4b6c-8047-46941df58dca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.538168 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-inventory" (OuterVolumeSpecName: "inventory") pod "c0c1f555-7a29-4b6c-8047-46941df58dca" (UID: "c0c1f555-7a29-4b6c-8047-46941df58dca"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.603032 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.603496 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c1f555-7a29-4b6c-8047-46941df58dca-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.603885 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkndd\" (UniqueName: \"kubernetes.io/projected/c0c1f555-7a29-4b6c-8047-46941df58dca-kube-api-access-wkndd\") on node \"crc\" DevicePath \"\"" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.843691 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" event={"ID":"c0c1f555-7a29-4b6c-8047-46941df58dca","Type":"ContainerDied","Data":"66652732898ff1f9a349e7572f8d6613f81a92c9b12d9569b05fc6bcceb8bd70"} Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.843755 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66652732898ff1f9a349e7572f8d6613f81a92c9b12d9569b05fc6bcceb8bd70" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.843859 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.921801 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq"] Dec 05 00:58:21 crc kubenswrapper[4759]: E1205 00:58:21.922244 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c1f555-7a29-4b6c-8047-46941df58dca" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.922262 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c1f555-7a29-4b6c-8047-46941df58dca" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.922534 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c1f555-7a29-4b6c-8047-46941df58dca" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.923210 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.926412 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.926797 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.926831 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.926916 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 00:58:21 crc kubenswrapper[4759]: I1205 00:58:21.940855 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq"] Dec 05 00:58:22 crc kubenswrapper[4759]: I1205 00:58:22.014342 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq\" (UID: \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:22 crc kubenswrapper[4759]: I1205 00:58:22.014432 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvkp4\" (UniqueName: \"kubernetes.io/projected/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-kube-api-access-fvkp4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq\" (UID: \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:22 crc kubenswrapper[4759]: I1205 00:58:22.014509 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq\" (UID: \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:22 crc kubenswrapper[4759]: I1205 00:58:22.116845 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq\" (UID: \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:22 crc kubenswrapper[4759]: I1205 00:58:22.116913 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvkp4\" (UniqueName: \"kubernetes.io/projected/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-kube-api-access-fvkp4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq\" (UID: \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:22 crc kubenswrapper[4759]: I1205 00:58:22.116969 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq\" (UID: 
\"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:22 crc kubenswrapper[4759]: I1205 00:58:22.122440 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq\" (UID: \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:22 crc kubenswrapper[4759]: I1205 00:58:22.122556 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq\" (UID: \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:22 crc kubenswrapper[4759]: I1205 00:58:22.160101 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvkp4\" (UniqueName: \"kubernetes.io/projected/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-kube-api-access-fvkp4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq\" (UID: \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:22 crc kubenswrapper[4759]: I1205 00:58:22.261690 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:22 crc kubenswrapper[4759]: I1205 00:58:22.855536 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq"] Dec 05 00:58:23 crc kubenswrapper[4759]: I1205 00:58:23.870292 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" event={"ID":"a99c592a-6fa3-42dd-af6c-cfb8dc151bff","Type":"ContainerStarted","Data":"66956da93a9224be4a4ad241c32187e0ee258e662f8ccbf2d6d40fb5e91adcce"} Dec 05 00:58:23 crc kubenswrapper[4759]: I1205 00:58:23.870698 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" event={"ID":"a99c592a-6fa3-42dd-af6c-cfb8dc151bff","Type":"ContainerStarted","Data":"dfa6737675399d83fcd77d8ef473d96ae2635ded645120404146c563af9be18b"} Dec 05 00:58:23 crc kubenswrapper[4759]: I1205 00:58:23.889458 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" podStartSLOduration=2.356275228 podStartE2EDuration="2.889435135s" podCreationTimestamp="2025-12-05 00:58:21 +0000 UTC" firstStartedPulling="2025-12-05 00:58:22.854743384 +0000 UTC m=+2122.070404364" lastFinishedPulling="2025-12-05 00:58:23.387903281 +0000 UTC m=+2122.603564271" observedRunningTime="2025-12-05 00:58:23.886435342 +0000 UTC m=+2123.102096302" watchObservedRunningTime="2025-12-05 00:58:23.889435135 +0000 UTC m=+2123.105096085" Dec 05 00:58:28 crc kubenswrapper[4759]: I1205 00:58:28.150370 4759 scope.go:117] "RemoveContainer" containerID="6e7299083296aa9b09586cbf7338a73dcbc4c02fedc01d10dc3c2eea7c01c6c7" Dec 05 00:58:28 crc kubenswrapper[4759]: I1205 00:58:28.192484 4759 scope.go:117] "RemoveContainer" containerID="889920e7c17f4f567f1b7c953223266e6fadf4eebc3eed05977ac09c646d6f2e" Dec 05 00:58:28 crc kubenswrapper[4759]: I1205 00:58:28.284662 4759 scope.go:117] "RemoveContainer" 
containerID="11e05ea9873d298f6a1fea5b94ea1bfd679a98c041c8e3961c45bca03fd4b558" Dec 05 00:58:28 crc kubenswrapper[4759]: I1205 00:58:28.316518 4759 scope.go:117] "RemoveContainer" containerID="d30a68c6629e79dc3a0bef6f425488cc91e9880857d62e70e196b69284dd14cd" Dec 05 00:58:28 crc kubenswrapper[4759]: I1205 00:58:28.358402 4759 scope.go:117] "RemoveContainer" containerID="be3dbd8d61f0df32e6d2a66eeb7d6787215f1c0835046283e09095c67f8989e4" Dec 05 00:58:28 crc kubenswrapper[4759]: I1205 00:58:28.408048 4759 scope.go:117] "RemoveContainer" containerID="777e420008aabd5db54b0bad93712191fce704248b5ef00c54aabeb0e58cce00" Dec 05 00:58:28 crc kubenswrapper[4759]: I1205 00:58:28.457957 4759 scope.go:117] "RemoveContainer" containerID="139b46ad2caaeb13207f6bdd7360947abb7115d2dceb8a7d5e27c4f3ec6396b3" Dec 05 00:58:28 crc kubenswrapper[4759]: I1205 00:58:28.484481 4759 scope.go:117] "RemoveContainer" containerID="503ade21aa166f3707a0a1af5b6c51f42c541655ea19ebc11caeac5b2b38e2dc" Dec 05 00:58:28 crc kubenswrapper[4759]: I1205 00:58:28.508671 4759 scope.go:117] "RemoveContainer" containerID="fa80d792c5fc55a9de968aea81dc994418f0f1b8c9ecd2b4c368817386f08df0" Dec 05 00:58:28 crc kubenswrapper[4759]: I1205 00:58:28.529895 4759 scope.go:117] "RemoveContainer" containerID="3b9bebe3289956d7d5490210b0ff017669d6da94cdb9de41a3498a402b1bc7e0" Dec 05 00:58:28 crc kubenswrapper[4759]: I1205 00:58:28.563975 4759 scope.go:117] "RemoveContainer" containerID="dfc2673f33c52fb8fef51d579367ce11bac54934c259f3e3b8474fcfa6281b66" Dec 05 00:58:34 crc kubenswrapper[4759]: I1205 00:58:34.040000 4759 generic.go:334] "Generic (PLEG): container finished" podID="a99c592a-6fa3-42dd-af6c-cfb8dc151bff" containerID="66956da93a9224be4a4ad241c32187e0ee258e662f8ccbf2d6d40fb5e91adcce" exitCode=0 Dec 05 00:58:34 crc kubenswrapper[4759]: I1205 00:58:34.040089 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" event={"ID":"a99c592a-6fa3-42dd-af6c-cfb8dc151bff","Type":"ContainerDied","Data":"66956da93a9224be4a4ad241c32187e0ee258e662f8ccbf2d6d40fb5e91adcce"} Dec 05 00:58:34 crc kubenswrapper[4759]: I1205 00:58:34.439983 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:58:34 crc kubenswrapper[4759]: I1205 00:58:34.440079 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:58:35 crc kubenswrapper[4759]: I1205 00:58:35.526768 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:35 crc kubenswrapper[4759]: I1205 00:58:35.634388 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-inventory\") pod \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\" (UID: \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " Dec 05 00:58:35 crc kubenswrapper[4759]: I1205 00:58:35.634844 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvkp4\" (UniqueName: \"kubernetes.io/projected/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-kube-api-access-fvkp4\") pod \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\" (UID: \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " Dec 05 00:58:35 crc kubenswrapper[4759]: I1205 00:58:35.634900 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-ssh-key\") pod \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\" (UID: \"a99c592a-6fa3-42dd-af6c-cfb8dc151bff\") " Dec 05 00:58:35 crc kubenswrapper[4759]: I1205 00:58:35.642675 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-kube-api-access-fvkp4" (OuterVolumeSpecName: "kube-api-access-fvkp4") pod "a99c592a-6fa3-42dd-af6c-cfb8dc151bff" (UID: "a99c592a-6fa3-42dd-af6c-cfb8dc151bff"). InnerVolumeSpecName "kube-api-access-fvkp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 00:58:35 crc kubenswrapper[4759]: I1205 00:58:35.666504 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a99c592a-6fa3-42dd-af6c-cfb8dc151bff" (UID: "a99c592a-6fa3-42dd-af6c-cfb8dc151bff"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:58:35 crc kubenswrapper[4759]: I1205 00:58:35.696923 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-inventory" (OuterVolumeSpecName: "inventory") pod "a99c592a-6fa3-42dd-af6c-cfb8dc151bff" (UID: "a99c592a-6fa3-42dd-af6c-cfb8dc151bff"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 00:58:35 crc kubenswrapper[4759]: I1205 00:58:35.737987 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvkp4\" (UniqueName: \"kubernetes.io/projected/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-kube-api-access-fvkp4\") on node \"crc\" DevicePath \"\"" Dec 05 00:58:35 crc kubenswrapper[4759]: I1205 00:58:35.738075 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 00:58:35 crc kubenswrapper[4759]: I1205 00:58:35.738104 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a99c592a-6fa3-42dd-af6c-cfb8dc151bff-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.065955 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" event={"ID":"a99c592a-6fa3-42dd-af6c-cfb8dc151bff","Type":"ContainerDied","Data":"dfa6737675399d83fcd77d8ef473d96ae2635ded645120404146c563af9be18b"} Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.066011 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfa6737675399d83fcd77d8ef473d96ae2635ded645120404146c563af9be18b" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.066038 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.173796 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z"] Dec 05 00:58:36 crc kubenswrapper[4759]: E1205 00:58:36.174244 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99c592a-6fa3-42dd-af6c-cfb8dc151bff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.174263 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99c592a-6fa3-42dd-af6c-cfb8dc151bff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.174486 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99c592a-6fa3-42dd-af6c-cfb8dc151bff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.175224 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.179800 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.179851 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.179921 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.180092 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.180265 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.181402 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.181443 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.181680 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.197760 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z"] Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.350502 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.350589 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.350752 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.350788 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq2fd\" (UniqueName: 
\"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-kube-api-access-pq2fd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.350893 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.350948 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.350975 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.351192 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.351464 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.351738 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.351802 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.351862 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.351931 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.453740 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.453904 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.453950 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq2fd\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-kube-api-access-pq2fd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.454082 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.454136 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-power-monitoring-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.454175 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.454284 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.454395 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.454474 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.454522 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.454599 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.454658 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.454699 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.462417 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.463076 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.463770 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.464941 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.465801 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.466278 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.466285 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.467124 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.468065 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.469017 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.473390 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.475570 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.485097 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq2fd\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-kube-api-access-pq2fd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:36 crc kubenswrapper[4759]: I1205 00:58:36.507294 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" Dec 05 00:58:37 crc kubenswrapper[4759]: I1205 00:58:37.195684 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z"] Dec 05 00:58:38 crc kubenswrapper[4759]: I1205 00:58:38.086717 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" event={"ID":"fe0e55b2-fb6e-4259-a701-84c4599770c7","Type":"ContainerStarted","Data":"253f8a43f31a7ddb7a0ef92eb148f1bbbd98514c5fe45af5e92b18038e4d8157"} Dec 05 00:58:38 crc kubenswrapper[4759]: I1205 00:58:38.087351 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" event={"ID":"fe0e55b2-fb6e-4259-a701-84c4599770c7","Type":"ContainerStarted","Data":"ce7bda84fab6191a15e4f04b1ecc7a4f3c8e971b4371d57978f82e43c8906e7f"} Dec 05 00:58:38 crc kubenswrapper[4759]: I1205 00:58:38.105771 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" podStartSLOduration=1.534659209 podStartE2EDuration="2.105753364s" podCreationTimestamp="2025-12-05 00:58:36 +0000 UTC" firstStartedPulling="2025-12-05 00:58:37.213267099 +0000 UTC m=+2136.428928039" lastFinishedPulling="2025-12-05 00:58:37.784361244 +0000 UTC m=+2137.000022194" observedRunningTime="2025-12-05 00:58:38.103899839 +0000 UTC m=+2137.319560799" watchObservedRunningTime="2025-12-05 00:58:38.105753364 +0000 UTC m=+2137.321414334" Dec 05 00:59:04 crc kubenswrapper[4759]: I1205 00:59:04.435290 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 00:59:04 crc kubenswrapper[4759]: I1205 00:59:04.436511 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 00:59:06 crc kubenswrapper[4759]: I1205 00:59:06.050711 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-cnpjh"] Dec 05 00:59:06 crc kubenswrapper[4759]: I1205 00:59:06.064312 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-cnpjh"] Dec 05 00:59:07 crc kubenswrapper[4759]: I1205 00:59:07.176709 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d77c176e-03d5-4b5f-908f-a95c826fea16" path="/var/lib/kubelet/pods/d77c176e-03d5-4b5f-908f-a95c826fea16/volumes" Dec 05 00:59:19 crc kubenswrapper[4759]: I1205 00:59:19.641446 4759 generic.go:334] "Generic (PLEG): container finished" podID="fe0e55b2-fb6e-4259-a701-84c4599770c7" containerID="253f8a43f31a7ddb7a0ef92eb148f1bbbd98514c5fe45af5e92b18038e4d8157" exitCode=0 Dec 05 00:59:19 crc kubenswrapper[4759]: I1205 00:59:19.641548 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" event={"ID":"fe0e55b2-fb6e-4259-a701-84c4599770c7","Type":"ContainerDied","Data":"253f8a43f31a7ddb7a0ef92eb148f1bbbd98514c5fe45af5e92b18038e4d8157"} Dec 05 
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.306198 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.421492 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq2fd\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-kube-api-access-pq2fd\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.421540 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-combined-ca-bundle\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.421569 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.421654 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-libvirt-combined-ca-bundle\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.421709 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-repo-setup-combined-ca-bundle\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.421733 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ssh-key\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.421755 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-inventory\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.421937 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.422095 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.422131 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ovn-combined-ca-bundle\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.422154 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.422176 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-bootstrap-combined-ca-bundle\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.422218 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-power-monitoring-combined-ca-bundle\") pod \"fe0e55b2-fb6e-4259-a701-84c4599770c7\" (UID: \"fe0e55b2-fb6e-4259-a701-84c4599770c7\") "
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.429147 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.429226 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-kube-api-access-pq2fd" (OuterVolumeSpecName: "kube-api-access-pq2fd") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "kube-api-access-pq2fd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.431187 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.431239 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.431288 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.431395 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.431861 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.433138 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.437692 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.438106 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.444494 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.479517 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-inventory" (OuterVolumeSpecName: "inventory") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.493880 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fe0e55b2-fb6e-4259-a701-84c4599770c7" (UID: "fe0e55b2-fb6e-4259-a701-84c4599770c7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524443 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524473 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524486 4759 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524496 4759 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524505 4759 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524515 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq2fd\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-kube-api-access-pq2fd\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524524 4759 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524533 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524542 4759 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524555 4759 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524564 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524572 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0e55b2-fb6e-4259-a701-84c4599770c7-inventory\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.524581 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fe0e55b2-fb6e-4259-a701-84c4599770c7-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.672609 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z" event={"ID":"fe0e55b2-fb6e-4259-a701-84c4599770c7","Type":"ContainerDied","Data":"ce7bda84fab6191a15e4f04b1ecc7a4f3c8e971b4371d57978f82e43c8906e7f"}
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.672677 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce7bda84fab6191a15e4f04b1ecc7a4f3c8e971b4371d57978f82e43c8906e7f"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.672764 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.817469 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"]
Dec 05 00:59:21 crc kubenswrapper[4759]: E1205 00:59:21.818023 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0e55b2-fb6e-4259-a701-84c4599770c7" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.818056 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0e55b2-fb6e-4259-a701-84c4599770c7" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.818441 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0e55b2-fb6e-4259-a701-84c4599770c7" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.819564 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
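Annotation: the run of UnmountVolume, TearDown and "Volume detached" lines above is the kubelet volume manager reconciling actual state against desired state once the install-certs pod terminated. A minimal sketch of that ordering, with simplified types of my own rather than kubelet's reconciler:

// reconcile.go - an illustrative sketch (names and types are mine, not
// kubelet's) of the unmount ordering visible above: for every volume still
// mounted but no longer desired, tear down the mount first, and only then
// record the volume as detached from the node.
package main

import "fmt"

type volume struct{ name, plugin string }

func reconcile(desired map[string]bool, actual []volume) {
	for _, v := range actual {
		if desired[v.name] {
			continue // still wanted by some pod; leave it mounted
		}
		// Phase 1: unmount (operation_generator's UnmountVolume.TearDown).
		fmt.Printf("UnmountVolume.TearDown succeeded for volume %q (plugin %s)\n", v.name, v.plugin)
		// Phase 2: only after TearDown succeeds is the volume reported detached.
		fmt.Printf("Volume detached for volume %q on node \"crc\"\n", v.name)
	}
}

func main() {
	actual := []volume{
		{"kube-api-access-pq2fd", "kubernetes.io/projected"},
		{"ssh-key", "kubernetes.io/secret"},
		{"inventory", "kubernetes.io/secret"},
	}
	reconcile(map[string]bool{}, actual) // pod deleted: nothing is desired any more
}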
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.824362 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.824421 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.824559 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.824679 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.825389 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.864190 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"]
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.932510 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.932780 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9fl6\" (UniqueName: \"kubernetes.io/projected/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-kube-api-access-n9fl6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.932912 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.933214 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:21 crc kubenswrapper[4759]: I1205 00:59:21.933257 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.036005 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.036477 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.036707 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.036892 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9fl6\" (UniqueName: \"kubernetes.io/projected/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-kube-api-access-n9fl6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.037181 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.038712 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.044896 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.045008 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.060954 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9fl6\" (UniqueName: \"kubernetes.io/projected/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-kube-api-access-n9fl6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.066441 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9tbds\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.154710 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"
Dec 05 00:59:22 crc kubenswrapper[4759]: I1205 00:59:22.825278 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"]
Dec 05 00:59:23 crc kubenswrapper[4759]: I1205 00:59:23.708845 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds" event={"ID":"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b","Type":"ContainerStarted","Data":"35e5fde3a54f0bc404fcecaf2841d6622c756c96f53e86074b4ada2bb9c74518"}
Dec 05 00:59:24 crc kubenswrapper[4759]: I1205 00:59:24.723102 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds" event={"ID":"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b","Type":"ContainerStarted","Data":"c45d18f5975f8e50a07bbeed5c0c4b8d9ecabe797e96a53472f8efc06eefec3a"}
Dec 05 00:59:24 crc kubenswrapper[4759]: I1205 00:59:24.745762 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds" podStartSLOduration=3.105002047 podStartE2EDuration="3.745745154s" podCreationTimestamp="2025-12-05 00:59:21 +0000 UTC" firstStartedPulling="2025-12-05 00:59:22.80064491 +0000 UTC m=+2182.016305860" lastFinishedPulling="2025-12-05 00:59:23.441388007 +0000 UTC m=+2182.657048967" observedRunningTime="2025-12-05 00:59:24.744577565 +0000 UTC m=+2183.960238515" watchObservedRunningTime="2025-12-05 00:59:24.745745154 +0000 UTC m=+2183.961406104"
Dec 05 00:59:28 crc kubenswrapper[4759]: I1205 00:59:28.882924 4759 scope.go:117] "RemoveContainer" containerID="89604e724134de4f98f0226eb9699c41ddc6d675cd76437609c5690f082425fa"
Dec 05 00:59:29 crc kubenswrapper[4759]: I1205 00:59:29.991618 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x7wb2"]
Dec 05 00:59:29 crc kubenswrapper[4759]: I1205 00:59:29.997019 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.022147 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x7wb2"]
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.156495 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-catalog-content\") pod \"community-operators-x7wb2\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") " pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.156649 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhcpf\" (UniqueName: \"kubernetes.io/projected/e7653009-a876-47dd-bfd6-8ed650f0fdb9-kube-api-access-jhcpf\") pod \"community-operators-x7wb2\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") " pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.156751 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-utilities\") pod \"community-operators-x7wb2\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") " pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.259036 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-catalog-content\") pod \"community-operators-x7wb2\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") " pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.259238 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhcpf\" (UniqueName: \"kubernetes.io/projected/e7653009-a876-47dd-bfd6-8ed650f0fdb9-kube-api-access-jhcpf\") pod \"community-operators-x7wb2\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") " pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.259407 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-utilities\") pod \"community-operators-x7wb2\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") " pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.259853 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-utilities\") pod \"community-operators-x7wb2\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") " pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.260563 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-catalog-content\") pod \"community-operators-x7wb2\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") " pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.284341 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhcpf\" (UniqueName: \"kubernetes.io/projected/e7653009-a876-47dd-bfd6-8ed650f0fdb9-kube-api-access-jhcpf\") pod \"community-operators-x7wb2\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") " pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.328399 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:30 crc kubenswrapper[4759]: I1205 00:59:30.942927 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x7wb2"]
Dec 05 00:59:31 crc kubenswrapper[4759]: I1205 00:59:31.812357 4759 generic.go:334] "Generic (PLEG): container finished" podID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" containerID="d1f25210e287119a5cd874f0f08c4b0b2872bfd86fde069426259a71d87d80f9" exitCode=0
Dec 05 00:59:31 crc kubenswrapper[4759]: I1205 00:59:31.812419 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7wb2" event={"ID":"e7653009-a876-47dd-bfd6-8ed650f0fdb9","Type":"ContainerDied","Data":"d1f25210e287119a5cd874f0f08c4b0b2872bfd86fde069426259a71d87d80f9"}
Dec 05 00:59:31 crc kubenswrapper[4759]: I1205 00:59:31.812718 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7wb2" event={"ID":"e7653009-a876-47dd-bfd6-8ed650f0fdb9","Type":"ContainerStarted","Data":"4eeb49c98d7840c9813960b4a529b7d275e999ea37a103beb180fa7650d9e030"}
Dec 05 00:59:34 crc kubenswrapper[4759]: I1205 00:59:34.433608 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 00:59:34 crc kubenswrapper[4759]: I1205 00:59:34.434356 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 00:59:34 crc kubenswrapper[4759]: I1205 00:59:34.434415 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns"
Dec 05 00:59:34 crc kubenswrapper[4759]: I1205 00:59:34.435484 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 05 00:59:34 crc kubenswrapper[4759]: I1205 00:59:34.435564 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" gracePeriod=600
Dec 05 00:59:34 crc kubenswrapper[4759]: I1205 00:59:34.845460 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7wb2" event={"ID":"e7653009-a876-47dd-bfd6-8ed650f0fdb9","Type":"ContainerStarted","Data":"43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980"}
event={"ID":"e7653009-a876-47dd-bfd6-8ed650f0fdb9","Type":"ContainerStarted","Data":"43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980"} Dec 05 00:59:35 crc kubenswrapper[4759]: E1205 00:59:35.072501 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:59:35 crc kubenswrapper[4759]: I1205 00:59:35.861937 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" exitCode=0 Dec 05 00:59:35 crc kubenswrapper[4759]: I1205 00:59:35.862054 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03"} Dec 05 00:59:35 crc kubenswrapper[4759]: I1205 00:59:35.862496 4759 scope.go:117] "RemoveContainer" containerID="b621a27b3e273f787ffd52c76f267d74b63dedbd64dc94442d2b4dbf1f2c61f8" Dec 05 00:59:35 crc kubenswrapper[4759]: I1205 00:59:35.863512 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 00:59:35 crc kubenswrapper[4759]: E1205 00:59:35.864028 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:59:35 crc kubenswrapper[4759]: I1205 00:59:35.866542 4759 generic.go:334] "Generic (PLEG): container finished" podID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" containerID="43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980" exitCode=0 Dec 05 00:59:35 crc kubenswrapper[4759]: I1205 00:59:35.866592 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7wb2" event={"ID":"e7653009-a876-47dd-bfd6-8ed650f0fdb9","Type":"ContainerDied","Data":"43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980"} Dec 05 00:59:36 crc kubenswrapper[4759]: I1205 00:59:36.930100 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7wb2" event={"ID":"e7653009-a876-47dd-bfd6-8ed650f0fdb9","Type":"ContainerStarted","Data":"e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c"} Dec 05 00:59:40 crc kubenswrapper[4759]: I1205 00:59:40.329621 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x7wb2" Dec 05 00:59:40 crc kubenswrapper[4759]: I1205 00:59:40.331627 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x7wb2" Dec 05 00:59:40 crc kubenswrapper[4759]: I1205 00:59:40.445763 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-x7wb2" Dec 05 00:59:40 crc kubenswrapper[4759]: I1205 00:59:40.496581 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x7wb2" podStartSLOduration=7.009979632 podStartE2EDuration="11.496538465s" podCreationTimestamp="2025-12-05 00:59:29 +0000 UTC" firstStartedPulling="2025-12-05 00:59:31.815998605 +0000 UTC m=+2191.031659555" lastFinishedPulling="2025-12-05 00:59:36.302557408 +0000 UTC m=+2195.518218388" observedRunningTime="2025-12-05 00:59:36.950283686 +0000 UTC m=+2196.165944626" watchObservedRunningTime="2025-12-05 00:59:40.496538465 +0000 UTC m=+2199.712199465" Dec 05 00:59:50 crc kubenswrapper[4759]: I1205 00:59:50.157072 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 00:59:50 crc kubenswrapper[4759]: E1205 00:59:50.158480 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 00:59:50 crc kubenswrapper[4759]: I1205 00:59:50.415827 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x7wb2" Dec 05 00:59:50 crc kubenswrapper[4759]: I1205 00:59:50.492623 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x7wb2"] Dec 05 00:59:51 crc kubenswrapper[4759]: I1205 00:59:51.117094 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x7wb2" podUID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" containerName="registry-server" containerID="cri-o://e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c" gracePeriod=2 Dec 05 00:59:51 crc kubenswrapper[4759]: I1205 00:59:51.699597 4759 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 00:59:51 crc kubenswrapper[4759]: I1205 00:59:51.825077 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-utilities\") pod \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") "
Dec 05 00:59:51 crc kubenswrapper[4759]: I1205 00:59:51.825234 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-catalog-content\") pod \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") "
Dec 05 00:59:51 crc kubenswrapper[4759]: I1205 00:59:51.825388 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhcpf\" (UniqueName: \"kubernetes.io/projected/e7653009-a876-47dd-bfd6-8ed650f0fdb9-kube-api-access-jhcpf\") pod \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\" (UID: \"e7653009-a876-47dd-bfd6-8ed650f0fdb9\") "
Dec 05 00:59:51 crc kubenswrapper[4759]: I1205 00:59:51.826430 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-utilities" (OuterVolumeSpecName: "utilities") pod "e7653009-a876-47dd-bfd6-8ed650f0fdb9" (UID: "e7653009-a876-47dd-bfd6-8ed650f0fdb9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:59:51 crc kubenswrapper[4759]: I1205 00:59:51.833198 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7653009-a876-47dd-bfd6-8ed650f0fdb9-kube-api-access-jhcpf" (OuterVolumeSpecName: "kube-api-access-jhcpf") pod "e7653009-a876-47dd-bfd6-8ed650f0fdb9" (UID: "e7653009-a876-47dd-bfd6-8ed650f0fdb9"). InnerVolumeSpecName "kube-api-access-jhcpf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 00:59:51 crc kubenswrapper[4759]: I1205 00:59:51.877113 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7653009-a876-47dd-bfd6-8ed650f0fdb9" (UID: "e7653009-a876-47dd-bfd6-8ed650f0fdb9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 00:59:51 crc kubenswrapper[4759]: I1205 00:59:51.928234 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:51 crc kubenswrapper[4759]: I1205 00:59:51.928321 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhcpf\" (UniqueName: \"kubernetes.io/projected/e7653009-a876-47dd-bfd6-8ed650f0fdb9-kube-api-access-jhcpf\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:51 crc kubenswrapper[4759]: I1205 00:59:51.928338 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7653009-a876-47dd-bfd6-8ed650f0fdb9-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.129943 4759 generic.go:334] "Generic (PLEG): container finished" podID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" containerID="e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c" exitCode=0
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.129986 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7wb2" event={"ID":"e7653009-a876-47dd-bfd6-8ed650f0fdb9","Type":"ContainerDied","Data":"e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c"}
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.130021 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7wb2" event={"ID":"e7653009-a876-47dd-bfd6-8ed650f0fdb9","Type":"ContainerDied","Data":"4eeb49c98d7840c9813960b4a529b7d275e999ea37a103beb180fa7650d9e030"}
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.130043 4759 scope.go:117] "RemoveContainer" containerID="e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c"
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.130061 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7wb2"
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.178477 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x7wb2"]
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.184812 4759 scope.go:117] "RemoveContainer" containerID="43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980"
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.187837 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x7wb2"]
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.245038 4759 scope.go:117] "RemoveContainer" containerID="d1f25210e287119a5cd874f0f08c4b0b2872bfd86fde069426259a71d87d80f9"
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.281438 4759 scope.go:117] "RemoveContainer" containerID="e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c"
Dec 05 00:59:52 crc kubenswrapper[4759]: E1205 00:59:52.281968 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c\": container with ID starting with e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c not found: ID does not exist" containerID="e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c"
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.282016 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c"} err="failed to get container status \"e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c\": rpc error: code = NotFound desc = could not find container \"e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c\": container with ID starting with e97d3a7a72958d1be1873360a85596f4eba90113c880aa5d92225ef1d4f1c38c not found: ID does not exist"
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.282052 4759 scope.go:117] "RemoveContainer" containerID="43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980"
Dec 05 00:59:52 crc kubenswrapper[4759]: E1205 00:59:52.282476 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980\": container with ID starting with 43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980 not found: ID does not exist" containerID="43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980"
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.282671 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980"} err="failed to get container status \"43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980\": rpc error: code = NotFound desc = could not find container \"43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980\": container with ID starting with 43c1a23dfdfd604b1e206dfe1edb8f83d1e51151405733515aa50bcc968e8980 not found: ID does not exist"
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.282870 4759 scope.go:117] "RemoveContainer" containerID="d1f25210e287119a5cd874f0f08c4b0b2872bfd86fde069426259a71d87d80f9"
Dec 05 00:59:52 crc kubenswrapper[4759]: E1205 00:59:52.283522 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f25210e287119a5cd874f0f08c4b0b2872bfd86fde069426259a71d87d80f9\": container with ID starting with d1f25210e287119a5cd874f0f08c4b0b2872bfd86fde069426259a71d87d80f9 not found: ID does not exist" containerID="d1f25210e287119a5cd874f0f08c4b0b2872bfd86fde069426259a71d87d80f9"
Dec 05 00:59:52 crc kubenswrapper[4759]: I1205 00:59:52.283560 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f25210e287119a5cd874f0f08c4b0b2872bfd86fde069426259a71d87d80f9"} err="failed to get container status \"d1f25210e287119a5cd874f0f08c4b0b2872bfd86fde069426259a71d87d80f9\": rpc error: code = NotFound desc = could not find container \"d1f25210e287119a5cd874f0f08c4b0b2872bfd86fde069426259a71d87d80f9\": container with ID starting with d1f25210e287119a5cd874f0f08c4b0b2872bfd86fde069426259a71d87d80f9 not found: ID does not exist"
Dec 05 00:59:53 crc kubenswrapper[4759]: I1205 00:59:53.168019 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" path="/var/lib/kubelet/pods/e7653009-a876-47dd-bfd6-8ed650f0fdb9/volumes"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.272169 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-65klv"]
Dec 05 00:59:59 crc kubenswrapper[4759]: E1205 00:59:59.274053 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" containerName="extract-utilities"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.274096 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" containerName="extract-utilities"
Dec 05 00:59:59 crc kubenswrapper[4759]: E1205 00:59:59.274147 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" containerName="registry-server"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.274163 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" containerName="registry-server"
Dec 05 00:59:59 crc kubenswrapper[4759]: E1205 00:59:59.274186 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" containerName="extract-content"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.274202 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" containerName="extract-content"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.274720 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7653009-a876-47dd-bfd6-8ed650f0fdb9" containerName="registry-server"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.278454 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65klv"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.286647 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-65klv"]
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.326864 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-catalog-content\") pod \"redhat-marketplace-65klv\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " pod="openshift-marketplace/redhat-marketplace-65klv"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.326935 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-utilities\") pod \"redhat-marketplace-65klv\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " pod="openshift-marketplace/redhat-marketplace-65klv"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.327057 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2sr8\" (UniqueName: \"kubernetes.io/projected/58b8e311-215d-4516-a9b1-3090d472a15b-kube-api-access-f2sr8\") pod \"redhat-marketplace-65klv\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " pod="openshift-marketplace/redhat-marketplace-65klv"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.429333 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2sr8\" (UniqueName: \"kubernetes.io/projected/58b8e311-215d-4516-a9b1-3090d472a15b-kube-api-access-f2sr8\") pod \"redhat-marketplace-65klv\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " pod="openshift-marketplace/redhat-marketplace-65klv"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.429528 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-catalog-content\") pod \"redhat-marketplace-65klv\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " pod="openshift-marketplace/redhat-marketplace-65klv"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.429566 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-utilities\") pod \"redhat-marketplace-65klv\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " pod="openshift-marketplace/redhat-marketplace-65klv"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.430088 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-utilities\") pod \"redhat-marketplace-65klv\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " pod="openshift-marketplace/redhat-marketplace-65klv"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.430529 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-catalog-content\") pod \"redhat-marketplace-65klv\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " pod="openshift-marketplace/redhat-marketplace-65klv"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.453601 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2sr8\" (UniqueName: \"kubernetes.io/projected/58b8e311-215d-4516-a9b1-3090d472a15b-kube-api-access-f2sr8\") pod \"redhat-marketplace-65klv\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " pod="openshift-marketplace/redhat-marketplace-65klv"
Dec 05 00:59:59 crc kubenswrapper[4759]: I1205 00:59:59.615130 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65klv"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.146806 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-65klv"]
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.178823 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"]
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.180637 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.183768 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.183950 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.191690 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"]
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.230759 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65klv" event={"ID":"58b8e311-215d-4516-a9b1-3090d472a15b","Type":"ContainerStarted","Data":"1b847331d9763f0377fca2a52ea0983ff9876d8fedfeaf4938a066d3267a8721"}
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.243040 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srmz7\" (UniqueName: \"kubernetes.io/projected/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-kube-api-access-srmz7\") pod \"collect-profiles-29414940-97xc7\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.243124 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-config-volume\") pod \"collect-profiles-29414940-97xc7\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.243223 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-secret-volume\") pod \"collect-profiles-29414940-97xc7\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.346371 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-secret-volume\") pod \"collect-profiles-29414940-97xc7\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.346710 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srmz7\" (UniqueName: \"kubernetes.io/projected/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-kube-api-access-srmz7\") pod \"collect-profiles-29414940-97xc7\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.346799 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-config-volume\") pod \"collect-profiles-29414940-97xc7\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.348764 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-config-volume\") pod \"collect-profiles-29414940-97xc7\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.360157 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-secret-volume\") pod \"collect-profiles-29414940-97xc7\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.364945 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srmz7\" (UniqueName: \"kubernetes.io/projected/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-kube-api-access-srmz7\") pod \"collect-profiles-29414940-97xc7\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"
Dec 05 01:00:00 crc kubenswrapper[4759]: I1205 01:00:00.650750 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7" Dec 05 01:00:01 crc kubenswrapper[4759]: I1205 01:00:01.232823 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"] Dec 05 01:00:01 crc kubenswrapper[4759]: I1205 01:00:01.245884 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7" event={"ID":"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04","Type":"ContainerStarted","Data":"8000dbe37b61e7a197f70e47d326e36a376de8003b87c656aaf66b16dfd37bd2"} Dec 05 01:00:01 crc kubenswrapper[4759]: I1205 01:00:01.248854 4759 generic.go:334] "Generic (PLEG): container finished" podID="58b8e311-215d-4516-a9b1-3090d472a15b" containerID="90462e9e005e2db5603a9b09d39e5d811152ea07ae8a40a982642fd084ab7556" exitCode=0 Dec 05 01:00:01 crc kubenswrapper[4759]: I1205 01:00:01.248934 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65klv" event={"ID":"58b8e311-215d-4516-a9b1-3090d472a15b","Type":"ContainerDied","Data":"90462e9e005e2db5603a9b09d39e5d811152ea07ae8a40a982642fd084ab7556"} Dec 05 01:00:02 crc kubenswrapper[4759]: I1205 01:00:02.156706 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:00:02 crc kubenswrapper[4759]: E1205 01:00:02.157235 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:00:02 crc kubenswrapper[4759]: I1205 01:00:02.262414 4759 generic.go:334] "Generic (PLEG): container finished" podID="7a1a3bf6-d1bb-407b-b081-7656a0ffaa04" containerID="a693af9082d34c20ecbba90a5cf5dcf55d4c3ad96f64e20a3594a1e39e506122" exitCode=0 Dec 05 01:00:02 crc kubenswrapper[4759]: I1205 01:00:02.262867 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7" event={"ID":"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04","Type":"ContainerDied","Data":"a693af9082d34c20ecbba90a5cf5dcf55d4c3ad96f64e20a3594a1e39e506122"} Dec 05 01:00:03 crc kubenswrapper[4759]: I1205 01:00:03.756416 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7" Dec 05 01:00:03 crc kubenswrapper[4759]: I1205 01:00:03.828362 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srmz7\" (UniqueName: \"kubernetes.io/projected/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-kube-api-access-srmz7\") pod \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " Dec 05 01:00:03 crc kubenswrapper[4759]: I1205 01:00:03.828531 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-secret-volume\") pod \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " Dec 05 01:00:03 crc kubenswrapper[4759]: I1205 01:00:03.828568 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-config-volume\") pod \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\" (UID: \"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04\") " Dec 05 01:00:03 crc kubenswrapper[4759]: I1205 01:00:03.830289 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-config-volume" (OuterVolumeSpecName: "config-volume") pod "7a1a3bf6-d1bb-407b-b081-7656a0ffaa04" (UID: "7a1a3bf6-d1bb-407b-b081-7656a0ffaa04"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:00:03 crc kubenswrapper[4759]: I1205 01:00:03.836766 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7a1a3bf6-d1bb-407b-b081-7656a0ffaa04" (UID: "7a1a3bf6-d1bb-407b-b081-7656a0ffaa04"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:00:03 crc kubenswrapper[4759]: I1205 01:00:03.842596 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-kube-api-access-srmz7" (OuterVolumeSpecName: "kube-api-access-srmz7") pod "7a1a3bf6-d1bb-407b-b081-7656a0ffaa04" (UID: "7a1a3bf6-d1bb-407b-b081-7656a0ffaa04"). InnerVolumeSpecName "kube-api-access-srmz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:00:03 crc kubenswrapper[4759]: I1205 01:00:03.931887 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srmz7\" (UniqueName: \"kubernetes.io/projected/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-kube-api-access-srmz7\") on node \"crc\" DevicePath \"\"" Dec 05 01:00:03 crc kubenswrapper[4759]: I1205 01:00:03.931917 4759 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 01:00:03 crc kubenswrapper[4759]: I1205 01:00:03.931928 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 01:00:04 crc kubenswrapper[4759]: I1205 01:00:04.304754 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7" event={"ID":"7a1a3bf6-d1bb-407b-b081-7656a0ffaa04","Type":"ContainerDied","Data":"8000dbe37b61e7a197f70e47d326e36a376de8003b87c656aaf66b16dfd37bd2"} Dec 05 01:00:04 crc kubenswrapper[4759]: I1205 01:00:04.304812 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8000dbe37b61e7a197f70e47d326e36a376de8003b87c656aaf66b16dfd37bd2" Dec 05 01:00:04 crc kubenswrapper[4759]: I1205 01:00:04.304830 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7" Dec 05 01:00:04 crc kubenswrapper[4759]: I1205 01:00:04.855137 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw"] Dec 05 01:00:04 crc kubenswrapper[4759]: I1205 01:00:04.867272 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414895-vsxsw"] Dec 05 01:00:05 crc kubenswrapper[4759]: I1205 01:00:05.176874 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba656f8-77bb-4402-8242-6fe3b116a8cc" path="/var/lib/kubelet/pods/9ba656f8-77bb-4402-8242-6fe3b116a8cc/volumes" Dec 05 01:00:14 crc kubenswrapper[4759]: I1205 01:00:14.156297 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:00:14 crc kubenswrapper[4759]: E1205 01:00:14.157364 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:00:27 crc kubenswrapper[4759]: I1205 01:00:27.048998 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-q2dv8"] Dec 05 01:00:27 crc kubenswrapper[4759]: I1205 01:00:27.061415 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-q2dv8"] Dec 05 01:00:27 crc kubenswrapper[4759]: I1205 01:00:27.167911 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f827c4c1-9ca4-455c-9e82-67ee6d95f5fd" path="/var/lib/kubelet/pods/f827c4c1-9ca4-455c-9e82-67ee6d95f5fd/volumes" Dec 05 01:00:28 crc 
kubenswrapper[4759]: I1205 01:00:28.157350 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:00:28 crc kubenswrapper[4759]: E1205 01:00:28.158480 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:00:28 crc kubenswrapper[4759]: I1205 01:00:28.980489 4759 scope.go:117] "RemoveContainer" containerID="5e40b7708eb1143bd097a614bd1b2e025aedb1b829aeefe678d239f283ef0b8a" Dec 05 01:00:29 crc kubenswrapper[4759]: I1205 01:00:29.023133 4759 scope.go:117] "RemoveContainer" containerID="7de19b6e6b0c37f09a38cb5eb6356e72eb94a1f7f8996bad2546224cda89f8bd" Dec 05 01:00:38 crc kubenswrapper[4759]: I1205 01:00:38.786280 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds" event={"ID":"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b","Type":"ContainerDied","Data":"c45d18f5975f8e50a07bbeed5c0c4b8d9ecabe797e96a53472f8efc06eefec3a"} Dec 05 01:00:38 crc kubenswrapper[4759]: I1205 01:00:38.786340 4759 generic.go:334] "Generic (PLEG): container finished" podID="35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b" containerID="c45d18f5975f8e50a07bbeed5c0c4b8d9ecabe797e96a53472f8efc06eefec3a" exitCode=0 Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.479723 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.633090 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9fl6\" (UniqueName: \"kubernetes.io/projected/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-kube-api-access-n9fl6\") pod \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.633148 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-inventory\") pod \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.633204 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovncontroller-config-0\") pod \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.633230 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovn-combined-ca-bundle\") pod \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\" (UID: \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.633288 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ssh-key\") pod \"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\" (UID: 
\"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b\") " Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.638631 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-kube-api-access-n9fl6" (OuterVolumeSpecName: "kube-api-access-n9fl6") pod "35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b" (UID: "35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b"). InnerVolumeSpecName "kube-api-access-n9fl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.641572 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b" (UID: "35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.662621 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-inventory" (OuterVolumeSpecName: "inventory") pod "35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b" (UID: "35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.691279 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b" (UID: "35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.694274 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b" (UID: "35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.736482 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9fl6\" (UniqueName: \"kubernetes.io/projected/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-kube-api-access-n9fl6\") on node \"crc\" DevicePath \"\"" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.736516 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.736529 4759 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.736541 4759 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.736552 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.815673 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds" event={"ID":"35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b","Type":"ContainerDied","Data":"35e5fde3a54f0bc404fcecaf2841d6622c756c96f53e86074b4ada2bb9c74518"} Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.815722 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35e5fde3a54f0bc404fcecaf2841d6622c756c96f53e86074b4ada2bb9c74518" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.815721 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.950698 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb"] Dec 05 01:00:40 crc kubenswrapper[4759]: E1205 01:00:40.951766 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.951793 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 01:00:40 crc kubenswrapper[4759]: E1205 01:00:40.951944 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1a3bf6-d1bb-407b-b081-7656a0ffaa04" containerName="collect-profiles" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.951981 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1a3bf6-d1bb-407b-b081-7656a0ffaa04" containerName="collect-profiles" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.952778 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1a3bf6-d1bb-407b-b081-7656a0ffaa04" containerName="collect-profiles" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.952853 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.954853 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.958326 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.958392 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.958571 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.958568 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.958825 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:00:40 crc kubenswrapper[4759]: I1205 01:00:40.967235 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb"] Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.046177 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.046505 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.046667 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.046739 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.046831 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj6zw\" (UniqueName: \"kubernetes.io/projected/c3b3d06e-6304-425d-b688-524cfbf7ea5a-kube-api-access-xj6zw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.149330 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.149779 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.149823 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.149874 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj6zw\" (UniqueName: \"kubernetes.io/projected/c3b3d06e-6304-425d-b688-524cfbf7ea5a-kube-api-access-xj6zw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.149961 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.157949 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.157975 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.158534 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.161536 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.180164 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj6zw\" (UniqueName: \"kubernetes.io/projected/c3b3d06e-6304-425d-b688-524cfbf7ea5a-kube-api-access-xj6zw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.318576 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.765813 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb"] Dec 05 01:00:41 crc kubenswrapper[4759]: I1205 01:00:41.858499 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" event={"ID":"c3b3d06e-6304-425d-b688-524cfbf7ea5a","Type":"ContainerStarted","Data":"20528f14dd0cf06313742969b6d60efa518969ed71b12f1952fa326efdad763d"} Dec 05 01:00:42 crc kubenswrapper[4759]: I1205 01:00:42.155857 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:00:42 crc kubenswrapper[4759]: E1205 01:00:42.156325 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:00:42 crc kubenswrapper[4759]: I1205 01:00:42.873071 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" event={"ID":"c3b3d06e-6304-425d-b688-524cfbf7ea5a","Type":"ContainerStarted","Data":"c801697c7f41025dc5e6dd5f0c573bfc35f754dce7df3f2a04fffd73b4580b53"} Dec 05 01:00:42 crc kubenswrapper[4759]: I1205 01:00:42.905298 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" podStartSLOduration=2.347682859 podStartE2EDuration="2.90527989s" podCreationTimestamp="2025-12-05 01:00:40 +0000 UTC" firstStartedPulling="2025-12-05 01:00:41.792489877 +0000 UTC m=+2261.008150827" lastFinishedPulling="2025-12-05 01:00:42.350086898 +0000 UTC m=+2261.565747858" observedRunningTime="2025-12-05 01:00:42.899165991 +0000 UTC m=+2262.114826961" watchObservedRunningTime="2025-12-05 01:00:42.90527989 +0000 UTC m=+2262.120940850" Dec 05 01:00:53 crc kubenswrapper[4759]: I1205 01:00:53.157284 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:00:53 crc kubenswrapper[4759]: E1205 01:00:53.158419 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.190106 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29414941-zc2s9"] Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.192247 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.199299 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414941-zc2s9"] Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.301976 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-fernet-keys\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.302134 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-config-data\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.302227 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8smkr\" (UniqueName: \"kubernetes.io/projected/411ce212-9655-4a4e-8056-adcbaf433178-kube-api-access-8smkr\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.302331 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-combined-ca-bundle\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.405114 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-config-data\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.405234 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8smkr\" (UniqueName: \"kubernetes.io/projected/411ce212-9655-4a4e-8056-adcbaf433178-kube-api-access-8smkr\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.405366 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-combined-ca-bundle\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.405558 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-fernet-keys\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.412725 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-config-data\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.418258 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-fernet-keys\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.419078 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-combined-ca-bundle\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.430927 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8smkr\" (UniqueName: \"kubernetes.io/projected/411ce212-9655-4a4e-8056-adcbaf433178-kube-api-access-8smkr\") pod \"keystone-cron-29414941-zc2s9\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:00 crc kubenswrapper[4759]: I1205 01:01:00.520796 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:01 crc kubenswrapper[4759]: I1205 01:01:01.070608 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414941-zc2s9"] Dec 05 01:01:01 crc kubenswrapper[4759]: E1205 01:01:01.079454 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: reading manifest sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594 in registry.redhat.io/redhat/redhat-marketplace-index: received unexpected HTTP status: 502 Bad Gateway" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 01:01:01 crc kubenswrapper[4759]: E1205 01:01:01.079680 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f2sr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-65klv_openshift-marketplace(58b8e311-215d-4516-a9b1-3090d472a15b): ErrImagePull: copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: reading manifest sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594 in registry.redhat.io/redhat/redhat-marketplace-index: received unexpected HTTP status: 502 Bad Gateway" logger="UnhandledError" Dec 05 01:01:01 crc kubenswrapper[4759]: E1205 01:01:01.080884 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: reading manifest sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594 in registry.redhat.io/redhat/redhat-marketplace-index: received unexpected HTTP status: 502 Bad Gateway\"" pod="openshift-marketplace/redhat-marketplace-65klv" podUID="58b8e311-215d-4516-a9b1-3090d472a15b" Dec 05 01:01:01 crc kubenswrapper[4759]: I1205 01:01:01.112996 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414941-zc2s9" event={"ID":"411ce212-9655-4a4e-8056-adcbaf433178","Type":"ContainerStarted","Data":"708b536563befd68f9ddee8df91d121c39ed5a787b53aac985df75e2ea3ae48d"} Dec 05 01:01:01 crc kubenswrapper[4759]: E1205 01:01:01.130095 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-65klv" podUID="58b8e311-215d-4516-a9b1-3090d472a15b" Dec 05 01:01:02 crc kubenswrapper[4759]: I1205 01:01:02.128805 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414941-zc2s9" event={"ID":"411ce212-9655-4a4e-8056-adcbaf433178","Type":"ContainerStarted","Data":"17891a5ee9375ad09feedde6389a60b0d13e8e21734664047cfdb0a07789530e"} Dec 05 01:01:02 crc kubenswrapper[4759]: I1205 01:01:02.156524 4759 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/keystone-cron-29414941-zc2s9" podStartSLOduration=2.156501216 podStartE2EDuration="2.156501216s" podCreationTimestamp="2025-12-05 01:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:01:02.144975293 +0000 UTC m=+2281.360636243" watchObservedRunningTime="2025-12-05 01:01:02.156501216 +0000 UTC m=+2281.372162166" Dec 05 01:01:04 crc kubenswrapper[4759]: I1205 01:01:04.160412 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:01:04 crc kubenswrapper[4759]: E1205 01:01:04.162298 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:01:04 crc kubenswrapper[4759]: I1205 01:01:04.176440 4759 generic.go:334] "Generic (PLEG): container finished" podID="411ce212-9655-4a4e-8056-adcbaf433178" containerID="17891a5ee9375ad09feedde6389a60b0d13e8e21734664047cfdb0a07789530e" exitCode=0 Dec 05 01:01:04 crc kubenswrapper[4759]: I1205 01:01:04.176660 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414941-zc2s9" event={"ID":"411ce212-9655-4a4e-8056-adcbaf433178","Type":"ContainerDied","Data":"17891a5ee9375ad09feedde6389a60b0d13e8e21734664047cfdb0a07789530e"} Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.734513 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.858917 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-fernet-keys\") pod \"411ce212-9655-4a4e-8056-adcbaf433178\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.858985 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-combined-ca-bundle\") pod \"411ce212-9655-4a4e-8056-adcbaf433178\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.859020 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8smkr\" (UniqueName: \"kubernetes.io/projected/411ce212-9655-4a4e-8056-adcbaf433178-kube-api-access-8smkr\") pod \"411ce212-9655-4a4e-8056-adcbaf433178\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.859078 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-config-data\") pod \"411ce212-9655-4a4e-8056-adcbaf433178\" (UID: \"411ce212-9655-4a4e-8056-adcbaf433178\") " Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.868545 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411ce212-9655-4a4e-8056-adcbaf433178-kube-api-access-8smkr" (OuterVolumeSpecName: "kube-api-access-8smkr") pod "411ce212-9655-4a4e-8056-adcbaf433178" (UID: "411ce212-9655-4a4e-8056-adcbaf433178"). InnerVolumeSpecName "kube-api-access-8smkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.876430 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "411ce212-9655-4a4e-8056-adcbaf433178" (UID: "411ce212-9655-4a4e-8056-adcbaf433178"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.905073 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "411ce212-9655-4a4e-8056-adcbaf433178" (UID: "411ce212-9655-4a4e-8056-adcbaf433178"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.938672 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-config-data" (OuterVolumeSpecName: "config-data") pod "411ce212-9655-4a4e-8056-adcbaf433178" (UID: "411ce212-9655-4a4e-8056-adcbaf433178"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.961582 4759 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.961646 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.961669 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8smkr\" (UniqueName: \"kubernetes.io/projected/411ce212-9655-4a4e-8056-adcbaf433178-kube-api-access-8smkr\") on node \"crc\" DevicePath \"\"" Dec 05 01:01:05 crc kubenswrapper[4759]: I1205 01:01:05.961689 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411ce212-9655-4a4e-8056-adcbaf433178-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 01:01:06 crc kubenswrapper[4759]: I1205 01:01:06.206413 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414941-zc2s9" event={"ID":"411ce212-9655-4a4e-8056-adcbaf433178","Type":"ContainerDied","Data":"708b536563befd68f9ddee8df91d121c39ed5a787b53aac985df75e2ea3ae48d"} Dec 05 01:01:06 crc kubenswrapper[4759]: I1205 01:01:06.206457 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="708b536563befd68f9ddee8df91d121c39ed5a787b53aac985df75e2ea3ae48d" Dec 05 01:01:06 crc kubenswrapper[4759]: I1205 01:01:06.206520 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414941-zc2s9" Dec 05 01:01:07 crc kubenswrapper[4759]: I1205 01:01:07.050805 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-jf2gl"] Dec 05 01:01:07 crc kubenswrapper[4759]: I1205 01:01:07.061822 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-jf2gl"] Dec 05 01:01:07 crc kubenswrapper[4759]: I1205 01:01:07.169980 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c9a7af-b42a-4d69-93e1-78040788fe1b" path="/var/lib/kubelet/pods/d7c9a7af-b42a-4d69-93e1-78040788fe1b/volumes" Dec 05 01:01:12 crc kubenswrapper[4759]: I1205 01:01:12.160760 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 01:01:17 crc kubenswrapper[4759]: I1205 01:01:17.157680 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:01:17 crc kubenswrapper[4759]: E1205 01:01:17.158641 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:01:21 crc kubenswrapper[4759]: I1205 01:01:21.446924 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65klv" 
event={"ID":"58b8e311-215d-4516-a9b1-3090d472a15b","Type":"ContainerStarted","Data":"c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea"} Dec 05 01:01:22 crc kubenswrapper[4759]: I1205 01:01:22.459740 4759 generic.go:334] "Generic (PLEG): container finished" podID="58b8e311-215d-4516-a9b1-3090d472a15b" containerID="c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea" exitCode=0 Dec 05 01:01:22 crc kubenswrapper[4759]: I1205 01:01:22.459797 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65klv" event={"ID":"58b8e311-215d-4516-a9b1-3090d472a15b","Type":"ContainerDied","Data":"c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea"} Dec 05 01:01:23 crc kubenswrapper[4759]: I1205 01:01:23.473548 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65klv" event={"ID":"58b8e311-215d-4516-a9b1-3090d472a15b","Type":"ContainerStarted","Data":"90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8"} Dec 05 01:01:23 crc kubenswrapper[4759]: I1205 01:01:23.503173 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-65klv" podStartSLOduration=2.90326272 podStartE2EDuration="1m24.503152107s" podCreationTimestamp="2025-12-05 00:59:59 +0000 UTC" firstStartedPulling="2025-12-05 01:00:01.251360154 +0000 UTC m=+2220.467021104" lastFinishedPulling="2025-12-05 01:01:22.851249531 +0000 UTC m=+2302.066910491" observedRunningTime="2025-12-05 01:01:23.50085406 +0000 UTC m=+2302.716515020" watchObservedRunningTime="2025-12-05 01:01:23.503152107 +0000 UTC m=+2302.718813067" Dec 05 01:01:29 crc kubenswrapper[4759]: I1205 01:01:29.154335 4759 scope.go:117] "RemoveContainer" containerID="da348c22afd5b86083ff116e053f1c904b4c3ec815b586f7d49604d77593b075" Dec 05 01:01:29 crc kubenswrapper[4759]: I1205 01:01:29.616541 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-65klv" Dec 05 01:01:29 crc kubenswrapper[4759]: I1205 01:01:29.616726 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-65klv" Dec 05 01:01:29 crc kubenswrapper[4759]: I1205 01:01:29.686594 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-65klv" Dec 05 01:01:30 crc kubenswrapper[4759]: I1205 01:01:30.640033 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-65klv" Dec 05 01:01:30 crc kubenswrapper[4759]: I1205 01:01:30.724135 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-65klv"] Dec 05 01:01:32 crc kubenswrapper[4759]: I1205 01:01:32.158396 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:01:32 crc kubenswrapper[4759]: E1205 01:01:32.159065 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:01:32 crc kubenswrapper[4759]: I1205 01:01:32.572622 4759 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-65klv" podUID="58b8e311-215d-4516-a9b1-3090d472a15b" containerName="registry-server" containerID="cri-o://90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8" gracePeriod=2 Dec 05 01:01:32 crc kubenswrapper[4759]: E1205 01:01:32.716939 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58b8e311_215d_4516_a9b1_3090d472a15b.slice/crio-90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58b8e311_215d_4516_a9b1_3090d472a15b.slice/crio-conmon-90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8.scope\": RecentStats: unable to find data in memory cache]" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.067352 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65klv" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.122526 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-utilities\") pod \"58b8e311-215d-4516-a9b1-3090d472a15b\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.122597 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2sr8\" (UniqueName: \"kubernetes.io/projected/58b8e311-215d-4516-a9b1-3090d472a15b-kube-api-access-f2sr8\") pod \"58b8e311-215d-4516-a9b1-3090d472a15b\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.122692 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-catalog-content\") pod \"58b8e311-215d-4516-a9b1-3090d472a15b\" (UID: \"58b8e311-215d-4516-a9b1-3090d472a15b\") " Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.124925 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-utilities" (OuterVolumeSpecName: "utilities") pod "58b8e311-215d-4516-a9b1-3090d472a15b" (UID: "58b8e311-215d-4516-a9b1-3090d472a15b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.133636 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58b8e311-215d-4516-a9b1-3090d472a15b-kube-api-access-f2sr8" (OuterVolumeSpecName: "kube-api-access-f2sr8") pod "58b8e311-215d-4516-a9b1-3090d472a15b" (UID: "58b8e311-215d-4516-a9b1-3090d472a15b"). InnerVolumeSpecName "kube-api-access-f2sr8". 
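
Annotation (not part of the captured journal): the pod_startup_latency_tracker entry above for redhat-marketplace-65klv records podStartE2EDuration="1m24.503152107s" against a podStartSLOduration of roughly 2.9s, with the gap accounted for by the image-pull back-off (firstStartedPulling 01:00:01, lastFinishedPulling 01:01:22). To collect all such measurements from this journal for a quick startup-latency overview, under the same assumption that the kubelet logs to the "kubelet" systemd unit:

    # Print every pod startup duration the kubelet observed, message text only
    journalctl -u kubelet -o cat --no-pager | grep -F 'Observed pod startup duration'
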
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.143806 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.143847 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2sr8\" (UniqueName: \"kubernetes.io/projected/58b8e311-215d-4516-a9b1-3090d472a15b-kube-api-access-f2sr8\") on node \"crc\" DevicePath \"\"" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.162181 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58b8e311-215d-4516-a9b1-3090d472a15b" (UID: "58b8e311-215d-4516-a9b1-3090d472a15b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.247082 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58b8e311-215d-4516-a9b1-3090d472a15b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.598761 4759 generic.go:334] "Generic (PLEG): container finished" podID="58b8e311-215d-4516-a9b1-3090d472a15b" containerID="90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8" exitCode=0 Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.598812 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65klv" event={"ID":"58b8e311-215d-4516-a9b1-3090d472a15b","Type":"ContainerDied","Data":"90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8"} Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.598856 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65klv" event={"ID":"58b8e311-215d-4516-a9b1-3090d472a15b","Type":"ContainerDied","Data":"1b847331d9763f0377fca2a52ea0983ff9876d8fedfeaf4938a066d3267a8721"} Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.598874 4759 scope.go:117] "RemoveContainer" containerID="90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.599097 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65klv" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.650740 4759 scope.go:117] "RemoveContainer" containerID="c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.662053 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-65klv"] Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.675157 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-65klv"] Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.684464 4759 scope.go:117] "RemoveContainer" containerID="90462e9e005e2db5603a9b09d39e5d811152ea07ae8a40a982642fd084ab7556" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.731827 4759 scope.go:117] "RemoveContainer" containerID="90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8" Dec 05 01:01:33 crc kubenswrapper[4759]: E1205 01:01:33.732699 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8\": container with ID starting with 90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8 not found: ID does not exist" containerID="90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.732742 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8"} err="failed to get container status \"90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8\": rpc error: code = NotFound desc = could not find container \"90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8\": container with ID starting with 90f6cf765a4ce14c52115c32db9a46c6a7fc097bbc7959087c96fd5b3512efc8 not found: ID does not exist" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.732770 4759 scope.go:117] "RemoveContainer" containerID="c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea" Dec 05 01:01:33 crc kubenswrapper[4759]: E1205 01:01:33.733213 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea\": container with ID starting with c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea not found: ID does not exist" containerID="c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.733244 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea"} err="failed to get container status \"c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea\": rpc error: code = NotFound desc = could not find container \"c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea\": container with ID starting with c33accbb69fce087ac684fc28d8007385d70321516e7cb9ea0a0fc0a3172f1ea not found: ID does not exist" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.733265 4759 scope.go:117] "RemoveContainer" containerID="90462e9e005e2db5603a9b09d39e5d811152ea07ae8a40a982642fd084ab7556" Dec 05 01:01:33 crc kubenswrapper[4759]: E1205 01:01:33.733618 4759 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"90462e9e005e2db5603a9b09d39e5d811152ea07ae8a40a982642fd084ab7556\": container with ID starting with 90462e9e005e2db5603a9b09d39e5d811152ea07ae8a40a982642fd084ab7556 not found: ID does not exist" containerID="90462e9e005e2db5603a9b09d39e5d811152ea07ae8a40a982642fd084ab7556" Dec 05 01:01:33 crc kubenswrapper[4759]: I1205 01:01:33.733642 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90462e9e005e2db5603a9b09d39e5d811152ea07ae8a40a982642fd084ab7556"} err="failed to get container status \"90462e9e005e2db5603a9b09d39e5d811152ea07ae8a40a982642fd084ab7556\": rpc error: code = NotFound desc = could not find container \"90462e9e005e2db5603a9b09d39e5d811152ea07ae8a40a982642fd084ab7556\": container with ID starting with 90462e9e005e2db5603a9b09d39e5d811152ea07ae8a40a982642fd084ab7556 not found: ID does not exist" Dec 05 01:01:35 crc kubenswrapper[4759]: I1205 01:01:35.181731 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58b8e311-215d-4516-a9b1-3090d472a15b" path="/var/lib/kubelet/pods/58b8e311-215d-4516-a9b1-3090d472a15b/volumes" Dec 05 01:01:47 crc kubenswrapper[4759]: I1205 01:01:47.155829 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:01:47 crc kubenswrapper[4759]: E1205 01:01:47.156728 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:02:00 crc kubenswrapper[4759]: I1205 01:02:00.156388 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:02:00 crc kubenswrapper[4759]: E1205 01:02:00.157196 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:02:14 crc kubenswrapper[4759]: I1205 01:02:14.156229 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:02:14 crc kubenswrapper[4759]: E1205 01:02:14.157487 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:02:29 crc kubenswrapper[4759]: I1205 01:02:29.156193 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:02:29 crc kubenswrapper[4759]: E1205 01:02:29.156962 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:02:40 crc kubenswrapper[4759]: I1205 01:02:40.155866 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:02:40 crc kubenswrapper[4759]: E1205 01:02:40.156993 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:02:54 crc kubenswrapper[4759]: I1205 01:02:54.156696 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:02:54 crc kubenswrapper[4759]: E1205 01:02:54.157666 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.239495 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fr27l"] Dec 05 01:03:00 crc kubenswrapper[4759]: E1205 01:03:00.240672 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b8e311-215d-4516-a9b1-3090d472a15b" containerName="extract-content" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.240693 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b8e311-215d-4516-a9b1-3090d472a15b" containerName="extract-content" Dec 05 01:03:00 crc kubenswrapper[4759]: E1205 01:03:00.240708 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b8e311-215d-4516-a9b1-3090d472a15b" containerName="registry-server" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.240717 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b8e311-215d-4516-a9b1-3090d472a15b" containerName="registry-server" Dec 05 01:03:00 crc kubenswrapper[4759]: E1205 01:03:00.240749 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b8e311-215d-4516-a9b1-3090d472a15b" containerName="extract-utilities" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.240759 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b8e311-215d-4516-a9b1-3090d472a15b" containerName="extract-utilities" Dec 05 01:03:00 crc kubenswrapper[4759]: E1205 01:03:00.240779 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411ce212-9655-4a4e-8056-adcbaf433178" containerName="keystone-cron" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.240789 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="411ce212-9655-4a4e-8056-adcbaf433178" containerName="keystone-cron" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.241089 4759 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="58b8e311-215d-4516-a9b1-3090d472a15b" containerName="registry-server" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.241110 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="411ce212-9655-4a4e-8056-adcbaf433178" containerName="keystone-cron" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.242978 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.265696 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fr27l"] Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.400648 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-utilities\") pod \"redhat-operators-fr27l\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.400780 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbsmc\" (UniqueName: \"kubernetes.io/projected/f5515353-5267-4213-aab1-70de405a163f-kube-api-access-tbsmc\") pod \"redhat-operators-fr27l\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.401110 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-catalog-content\") pod \"redhat-operators-fr27l\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.503761 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-utilities\") pod \"redhat-operators-fr27l\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.503842 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbsmc\" (UniqueName: \"kubernetes.io/projected/f5515353-5267-4213-aab1-70de405a163f-kube-api-access-tbsmc\") pod \"redhat-operators-fr27l\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.503967 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-catalog-content\") pod \"redhat-operators-fr27l\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.504409 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-utilities\") pod \"redhat-operators-fr27l\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.504490 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-catalog-content\") pod \"redhat-operators-fr27l\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.531903 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbsmc\" (UniqueName: \"kubernetes.io/projected/f5515353-5267-4213-aab1-70de405a163f-kube-api-access-tbsmc\") pod \"redhat-operators-fr27l\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:00 crc kubenswrapper[4759]: I1205 01:03:00.574286 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:01 crc kubenswrapper[4759]: I1205 01:03:01.105642 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fr27l"] Dec 05 01:03:01 crc kubenswrapper[4759]: I1205 01:03:01.703448 4759 generic.go:334] "Generic (PLEG): container finished" podID="f5515353-5267-4213-aab1-70de405a163f" containerID="2f89aea1dac3eb503b8d556c9b05b5b0ba18bfafac84c282503771a7be776e7b" exitCode=0 Dec 05 01:03:01 crc kubenswrapper[4759]: I1205 01:03:01.703551 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr27l" event={"ID":"f5515353-5267-4213-aab1-70de405a163f","Type":"ContainerDied","Data":"2f89aea1dac3eb503b8d556c9b05b5b0ba18bfafac84c282503771a7be776e7b"} Dec 05 01:03:01 crc kubenswrapper[4759]: I1205 01:03:01.703700 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr27l" event={"ID":"f5515353-5267-4213-aab1-70de405a163f","Type":"ContainerStarted","Data":"18f1ab8647dbf0326dad6e2d9fbbb17f7cea68cb007adcfda19b800064996e7b"} Dec 05 01:03:02 crc kubenswrapper[4759]: I1205 01:03:02.715068 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr27l" event={"ID":"f5515353-5267-4213-aab1-70de405a163f","Type":"ContainerStarted","Data":"76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253"} Dec 05 01:03:07 crc kubenswrapper[4759]: I1205 01:03:07.784279 4759 generic.go:334] "Generic (PLEG): container finished" podID="f5515353-5267-4213-aab1-70de405a163f" containerID="76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253" exitCode=0 Dec 05 01:03:07 crc kubenswrapper[4759]: I1205 01:03:07.785053 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr27l" event={"ID":"f5515353-5267-4213-aab1-70de405a163f","Type":"ContainerDied","Data":"76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253"} Dec 05 01:03:08 crc kubenswrapper[4759]: I1205 01:03:08.156429 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:03:08 crc kubenswrapper[4759]: E1205 01:03:08.156910 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:03:08 crc kubenswrapper[4759]: I1205 01:03:08.800203 4759 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr27l" event={"ID":"f5515353-5267-4213-aab1-70de405a163f","Type":"ContainerStarted","Data":"90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106"} Dec 05 01:03:08 crc kubenswrapper[4759]: I1205 01:03:08.832103 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fr27l" podStartSLOduration=2.272309782 podStartE2EDuration="8.832076603s" podCreationTimestamp="2025-12-05 01:03:00 +0000 UTC" firstStartedPulling="2025-12-05 01:03:01.706978341 +0000 UTC m=+2400.922639291" lastFinishedPulling="2025-12-05 01:03:08.266745152 +0000 UTC m=+2407.482406112" observedRunningTime="2025-12-05 01:03:08.824363564 +0000 UTC m=+2408.040024534" watchObservedRunningTime="2025-12-05 01:03:08.832076603 +0000 UTC m=+2408.047737593" Dec 05 01:03:10 crc kubenswrapper[4759]: I1205 01:03:10.575791 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:10 crc kubenswrapper[4759]: I1205 01:03:10.576204 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:11 crc kubenswrapper[4759]: I1205 01:03:11.664173 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fr27l" podUID="f5515353-5267-4213-aab1-70de405a163f" containerName="registry-server" probeResult="failure" output=< Dec 05 01:03:11 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 01:03:11 crc kubenswrapper[4759]: > Dec 05 01:03:20 crc kubenswrapper[4759]: I1205 01:03:20.662468 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:20 crc kubenswrapper[4759]: I1205 01:03:20.753231 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:20 crc kubenswrapper[4759]: I1205 01:03:20.925584 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fr27l"] Dec 05 01:03:21 crc kubenswrapper[4759]: I1205 01:03:21.960056 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fr27l" podUID="f5515353-5267-4213-aab1-70de405a163f" containerName="registry-server" containerID="cri-o://90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106" gracePeriod=2 Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.156825 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:03:22 crc kubenswrapper[4759]: E1205 01:03:22.157075 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.511945 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.672813 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-utilities\") pod \"f5515353-5267-4213-aab1-70de405a163f\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.672864 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbsmc\" (UniqueName: \"kubernetes.io/projected/f5515353-5267-4213-aab1-70de405a163f-kube-api-access-tbsmc\") pod \"f5515353-5267-4213-aab1-70de405a163f\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.672998 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-catalog-content\") pod \"f5515353-5267-4213-aab1-70de405a163f\" (UID: \"f5515353-5267-4213-aab1-70de405a163f\") " Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.674092 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-utilities" (OuterVolumeSpecName: "utilities") pod "f5515353-5267-4213-aab1-70de405a163f" (UID: "f5515353-5267-4213-aab1-70de405a163f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.679043 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5515353-5267-4213-aab1-70de405a163f-kube-api-access-tbsmc" (OuterVolumeSpecName: "kube-api-access-tbsmc") pod "f5515353-5267-4213-aab1-70de405a163f" (UID: "f5515353-5267-4213-aab1-70de405a163f"). InnerVolumeSpecName "kube-api-access-tbsmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.775638 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.775670 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbsmc\" (UniqueName: \"kubernetes.io/projected/f5515353-5267-4213-aab1-70de405a163f-kube-api-access-tbsmc\") on node \"crc\" DevicePath \"\"" Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.777675 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5515353-5267-4213-aab1-70de405a163f" (UID: "f5515353-5267-4213-aab1-70de405a163f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.877205 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5515353-5267-4213-aab1-70de405a163f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.980984 4759 generic.go:334] "Generic (PLEG): container finished" podID="f5515353-5267-4213-aab1-70de405a163f" containerID="90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106" exitCode=0 Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.981051 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr27l" event={"ID":"f5515353-5267-4213-aab1-70de405a163f","Type":"ContainerDied","Data":"90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106"} Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.981129 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr27l" event={"ID":"f5515353-5267-4213-aab1-70de405a163f","Type":"ContainerDied","Data":"18f1ab8647dbf0326dad6e2d9fbbb17f7cea68cb007adcfda19b800064996e7b"} Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.981140 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fr27l" Dec 05 01:03:22 crc kubenswrapper[4759]: I1205 01:03:22.981160 4759 scope.go:117] "RemoveContainer" containerID="90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106" Dec 05 01:03:23 crc kubenswrapper[4759]: I1205 01:03:23.032600 4759 scope.go:117] "RemoveContainer" containerID="76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253" Dec 05 01:03:23 crc kubenswrapper[4759]: I1205 01:03:23.049493 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fr27l"] Dec 05 01:03:23 crc kubenswrapper[4759]: I1205 01:03:23.066411 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fr27l"] Dec 05 01:03:23 crc kubenswrapper[4759]: I1205 01:03:23.080217 4759 scope.go:117] "RemoveContainer" containerID="2f89aea1dac3eb503b8d556c9b05b5b0ba18bfafac84c282503771a7be776e7b" Dec 05 01:03:23 crc kubenswrapper[4759]: I1205 01:03:23.132732 4759 scope.go:117] "RemoveContainer" containerID="90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106" Dec 05 01:03:23 crc kubenswrapper[4759]: E1205 01:03:23.133504 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106\": container with ID starting with 90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106 not found: ID does not exist" containerID="90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106" Dec 05 01:03:23 crc kubenswrapper[4759]: I1205 01:03:23.133555 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106"} err="failed to get container status \"90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106\": rpc error: code = NotFound desc = could not find container \"90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106\": container with ID starting with 90a7bc477bcfb1e6d8313bf2799394e00e096e51e3f1a430d15c6619f5e9c106 not found: ID does not exist" Dec 05 01:03:23 crc 
kubenswrapper[4759]: I1205 01:03:23.133585 4759 scope.go:117] "RemoveContainer" containerID="76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253" Dec 05 01:03:23 crc kubenswrapper[4759]: E1205 01:03:23.134051 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253\": container with ID starting with 76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253 not found: ID does not exist" containerID="76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253" Dec 05 01:03:23 crc kubenswrapper[4759]: I1205 01:03:23.134095 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253"} err="failed to get container status \"76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253\": rpc error: code = NotFound desc = could not find container \"76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253\": container with ID starting with 76cce602eca0a21973b6705397e29b96af65001238c48a64d1da595dd6b7e253 not found: ID does not exist" Dec 05 01:03:23 crc kubenswrapper[4759]: I1205 01:03:23.134117 4759 scope.go:117] "RemoveContainer" containerID="2f89aea1dac3eb503b8d556c9b05b5b0ba18bfafac84c282503771a7be776e7b" Dec 05 01:03:23 crc kubenswrapper[4759]: E1205 01:03:23.134734 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f89aea1dac3eb503b8d556c9b05b5b0ba18bfafac84c282503771a7be776e7b\": container with ID starting with 2f89aea1dac3eb503b8d556c9b05b5b0ba18bfafac84c282503771a7be776e7b not found: ID does not exist" containerID="2f89aea1dac3eb503b8d556c9b05b5b0ba18bfafac84c282503771a7be776e7b" Dec 05 01:03:23 crc kubenswrapper[4759]: I1205 01:03:23.134765 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f89aea1dac3eb503b8d556c9b05b5b0ba18bfafac84c282503771a7be776e7b"} err="failed to get container status \"2f89aea1dac3eb503b8d556c9b05b5b0ba18bfafac84c282503771a7be776e7b\": rpc error: code = NotFound desc = could not find container \"2f89aea1dac3eb503b8d556c9b05b5b0ba18bfafac84c282503771a7be776e7b\": container with ID starting with 2f89aea1dac3eb503b8d556c9b05b5b0ba18bfafac84c282503771a7be776e7b not found: ID does not exist" Dec 05 01:03:23 crc kubenswrapper[4759]: I1205 01:03:23.179929 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5515353-5267-4213-aab1-70de405a163f" path="/var/lib/kubelet/pods/f5515353-5267-4213-aab1-70de405a163f/volumes" Dec 05 01:03:37 crc kubenswrapper[4759]: I1205 01:03:37.156097 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:03:37 crc kubenswrapper[4759]: E1205 01:03:37.156914 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:03:49 crc kubenswrapper[4759]: I1205 01:03:49.158762 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" 
Dec 05 01:03:49 crc kubenswrapper[4759]: E1205 01:03:49.162302 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:04:01 crc kubenswrapper[4759]: I1205 01:04:01.165417 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:04:01 crc kubenswrapper[4759]: E1205 01:04:01.166514 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:04:13 crc kubenswrapper[4759]: I1205 01:04:13.155821 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:04:13 crc kubenswrapper[4759]: E1205 01:04:13.156797 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:04:26 crc kubenswrapper[4759]: I1205 01:04:26.155478 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:04:26 crc kubenswrapper[4759]: E1205 01:04:26.156342 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:04:39 crc kubenswrapper[4759]: I1205 01:04:39.156446 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:04:40 crc kubenswrapper[4759]: I1205 01:04:40.010321 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"006ff0659038a9a56fe5eca39d856933d9115b2d975aabcc06e2ec6eb00ac4e8"} Dec 05 01:05:13 crc kubenswrapper[4759]: I1205 01:05:13.478356 4759 generic.go:334] "Generic (PLEG): container finished" podID="c3b3d06e-6304-425d-b688-524cfbf7ea5a" containerID="c801697c7f41025dc5e6dd5f0c573bfc35f754dce7df3f2a04fffd73b4580b53" exitCode=0 Dec 05 01:05:13 crc kubenswrapper[4759]: I1205 01:05:13.478490 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" 
event={"ID":"c3b3d06e-6304-425d-b688-524cfbf7ea5a","Type":"ContainerDied","Data":"c801697c7f41025dc5e6dd5f0c573bfc35f754dce7df3f2a04fffd73b4580b53"} Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.034983 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.129953 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-combined-ca-bundle\") pod \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.130123 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-inventory\") pod \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.130188 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj6zw\" (UniqueName: \"kubernetes.io/projected/c3b3d06e-6304-425d-b688-524cfbf7ea5a-kube-api-access-xj6zw\") pod \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.130209 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-ssh-key\") pod \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.130241 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-secret-0\") pod \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\" (UID: \"c3b3d06e-6304-425d-b688-524cfbf7ea5a\") " Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.137828 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c3b3d06e-6304-425d-b688-524cfbf7ea5a" (UID: "c3b3d06e-6304-425d-b688-524cfbf7ea5a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.139574 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b3d06e-6304-425d-b688-524cfbf7ea5a-kube-api-access-xj6zw" (OuterVolumeSpecName: "kube-api-access-xj6zw") pod "c3b3d06e-6304-425d-b688-524cfbf7ea5a" (UID: "c3b3d06e-6304-425d-b688-524cfbf7ea5a"). InnerVolumeSpecName "kube-api-access-xj6zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.161836 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-inventory" (OuterVolumeSpecName: "inventory") pod "c3b3d06e-6304-425d-b688-524cfbf7ea5a" (UID: "c3b3d06e-6304-425d-b688-524cfbf7ea5a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.170243 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c3b3d06e-6304-425d-b688-524cfbf7ea5a" (UID: "c3b3d06e-6304-425d-b688-524cfbf7ea5a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.178985 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c3b3d06e-6304-425d-b688-524cfbf7ea5a" (UID: "c3b3d06e-6304-425d-b688-524cfbf7ea5a"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.232979 4759 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.233047 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.233063 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj6zw\" (UniqueName: \"kubernetes.io/projected/c3b3d06e-6304-425d-b688-524cfbf7ea5a-kube-api-access-xj6zw\") on node \"crc\" DevicePath \"\"" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.233075 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.233089 4759 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c3b3d06e-6304-425d-b688-524cfbf7ea5a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.505676 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" event={"ID":"c3b3d06e-6304-425d-b688-524cfbf7ea5a","Type":"ContainerDied","Data":"20528f14dd0cf06313742969b6d60efa518969ed71b12f1952fa326efdad763d"} Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.505953 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20528f14dd0cf06313742969b6d60efa518969ed71b12f1952fa326efdad763d" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.505791 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.627441 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f"] Dec 05 01:05:15 crc kubenswrapper[4759]: E1205 01:05:15.628203 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b3d06e-6304-425d-b688-524cfbf7ea5a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.628465 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b3d06e-6304-425d-b688-524cfbf7ea5a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 01:05:15 crc kubenswrapper[4759]: E1205 01:05:15.628584 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5515353-5267-4213-aab1-70de405a163f" containerName="registry-server" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.628659 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5515353-5267-4213-aab1-70de405a163f" containerName="registry-server" Dec 05 01:05:15 crc kubenswrapper[4759]: E1205 01:05:15.628810 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5515353-5267-4213-aab1-70de405a163f" containerName="extract-utilities" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.628932 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5515353-5267-4213-aab1-70de405a163f" containerName="extract-utilities" Dec 05 01:05:15 crc kubenswrapper[4759]: E1205 01:05:15.629028 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5515353-5267-4213-aab1-70de405a163f" containerName="extract-content" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.629107 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5515353-5267-4213-aab1-70de405a163f" containerName="extract-content" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.629471 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b3d06e-6304-425d-b688-524cfbf7ea5a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.629571 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5515353-5267-4213-aab1-70de405a163f" containerName="registry-server" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.630619 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.633299 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.633798 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.634102 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.634402 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.634656 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.638438 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f"] Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.751967 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.752022 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.752072 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.752407 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.752547 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp2mr\" (UniqueName: \"kubernetes.io/projected/a691f6a0-4a00-491e-ad04-32f31f8dc175-kube-api-access-cp2mr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 
01:05:15.752871 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.753033 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.854639 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.854718 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.854785 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.854822 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp2mr\" (UniqueName: \"kubernetes.io/projected/a691f6a0-4a00-491e-ad04-32f31f8dc175-kube-api-access-cp2mr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.854881 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.854905 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: 
\"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.854941 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.860325 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.861808 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.862157 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.863041 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.863273 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.866516 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.885947 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp2mr\" (UniqueName: \"kubernetes.io/projected/a691f6a0-4a00-491e-ad04-32f31f8dc175-kube-api-access-cp2mr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-db89f\" (UID: 
\"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:15 crc kubenswrapper[4759]: I1205 01:05:15.985347 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:05:16 crc kubenswrapper[4759]: I1205 01:05:16.579854 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f"] Dec 05 01:05:17 crc kubenswrapper[4759]: I1205 01:05:17.530956 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" event={"ID":"a691f6a0-4a00-491e-ad04-32f31f8dc175","Type":"ContainerStarted","Data":"f71bffd0bd259e97d4cb2adb2f2b8c5c7f20d86681f72ccec2d14892d6afb141"} Dec 05 01:05:17 crc kubenswrapper[4759]: I1205 01:05:17.531540 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" event={"ID":"a691f6a0-4a00-491e-ad04-32f31f8dc175","Type":"ContainerStarted","Data":"54bd6985dacd859679e03723b5729be84b627e8354e56b9100f0e7a7ec21a1d1"} Dec 05 01:05:17 crc kubenswrapper[4759]: I1205 01:05:17.550754 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" podStartSLOduration=2.074630031 podStartE2EDuration="2.55073772s" podCreationTimestamp="2025-12-05 01:05:15 +0000 UTC" firstStartedPulling="2025-12-05 01:05:16.589646264 +0000 UTC m=+2535.805307214" lastFinishedPulling="2025-12-05 01:05:17.065753913 +0000 UTC m=+2536.281414903" observedRunningTime="2025-12-05 01:05:17.549793626 +0000 UTC m=+2536.765454596" watchObservedRunningTime="2025-12-05 01:05:17.55073772 +0000 UTC m=+2536.766398670" Dec 05 01:06:39 crc kubenswrapper[4759]: I1205 01:06:39.832205 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r9f8v"] Dec 05 01:06:39 crc kubenswrapper[4759]: I1205 01:06:39.835040 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:39 crc kubenswrapper[4759]: I1205 01:06:39.884209 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r9f8v"] Dec 05 01:06:39 crc kubenswrapper[4759]: I1205 01:06:39.896478 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x5sx\" (UniqueName: \"kubernetes.io/projected/8a80d917-fc47-4e7e-9aae-b752616833e1-kube-api-access-5x5sx\") pod \"certified-operators-r9f8v\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:39 crc kubenswrapper[4759]: I1205 01:06:39.896631 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-utilities\") pod \"certified-operators-r9f8v\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:39 crc kubenswrapper[4759]: I1205 01:06:39.896679 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-catalog-content\") pod \"certified-operators-r9f8v\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:39 crc kubenswrapper[4759]: I1205 01:06:39.998737 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-utilities\") pod \"certified-operators-r9f8v\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:39 crc kubenswrapper[4759]: I1205 01:06:39.998804 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-catalog-content\") pod \"certified-operators-r9f8v\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:39 crc kubenswrapper[4759]: I1205 01:06:39.998912 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x5sx\" (UniqueName: \"kubernetes.io/projected/8a80d917-fc47-4e7e-9aae-b752616833e1-kube-api-access-5x5sx\") pod \"certified-operators-r9f8v\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:39 crc kubenswrapper[4759]: I1205 01:06:39.999335 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-utilities\") pod \"certified-operators-r9f8v\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:39 crc kubenswrapper[4759]: I1205 01:06:39.999410 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-catalog-content\") pod \"certified-operators-r9f8v\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:40 crc kubenswrapper[4759]: I1205 01:06:40.047189 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5x5sx\" (UniqueName: \"kubernetes.io/projected/8a80d917-fc47-4e7e-9aae-b752616833e1-kube-api-access-5x5sx\") pod \"certified-operators-r9f8v\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:40 crc kubenswrapper[4759]: I1205 01:06:40.181632 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:40 crc kubenswrapper[4759]: I1205 01:06:40.710525 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r9f8v"] Dec 05 01:06:40 crc kubenswrapper[4759]: I1205 01:06:40.810086 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9f8v" event={"ID":"8a80d917-fc47-4e7e-9aae-b752616833e1","Type":"ContainerStarted","Data":"ad858fd8582218616a1f9e599c8c15fc961aaf9917a234555095725e1b854de1"} Dec 05 01:06:41 crc kubenswrapper[4759]: I1205 01:06:41.831331 4759 generic.go:334] "Generic (PLEG): container finished" podID="8a80d917-fc47-4e7e-9aae-b752616833e1" containerID="33b9076a3cfd0bcdb4726d74c6c62ee37e27cf41db7a74760e29f178aae0b6cb" exitCode=0 Dec 05 01:06:41 crc kubenswrapper[4759]: I1205 01:06:41.831374 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9f8v" event={"ID":"8a80d917-fc47-4e7e-9aae-b752616833e1","Type":"ContainerDied","Data":"33b9076a3cfd0bcdb4726d74c6c62ee37e27cf41db7a74760e29f178aae0b6cb"} Dec 05 01:06:41 crc kubenswrapper[4759]: I1205 01:06:41.833978 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 01:06:43 crc kubenswrapper[4759]: I1205 01:06:43.862876 4759 generic.go:334] "Generic (PLEG): container finished" podID="8a80d917-fc47-4e7e-9aae-b752616833e1" containerID="51c8c7172d1285cd978a6bd647b46a033dbe2543fe9e6ba4038e1abcea037eee" exitCode=0 Dec 05 01:06:43 crc kubenswrapper[4759]: I1205 01:06:43.862991 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9f8v" event={"ID":"8a80d917-fc47-4e7e-9aae-b752616833e1","Type":"ContainerDied","Data":"51c8c7172d1285cd978a6bd647b46a033dbe2543fe9e6ba4038e1abcea037eee"} Dec 05 01:06:44 crc kubenswrapper[4759]: I1205 01:06:44.878537 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9f8v" event={"ID":"8a80d917-fc47-4e7e-9aae-b752616833e1","Type":"ContainerStarted","Data":"e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af"} Dec 05 01:06:44 crc kubenswrapper[4759]: I1205 01:06:44.940944 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r9f8v" podStartSLOduration=3.359255607 podStartE2EDuration="5.940926827s" podCreationTimestamp="2025-12-05 01:06:39 +0000 UTC" firstStartedPulling="2025-12-05 01:06:41.833607215 +0000 UTC m=+2621.049268175" lastFinishedPulling="2025-12-05 01:06:44.415278445 +0000 UTC m=+2623.630939395" observedRunningTime="2025-12-05 01:06:44.935411772 +0000 UTC m=+2624.151072732" watchObservedRunningTime="2025-12-05 01:06:44.940926827 +0000 UTC m=+2624.156587777" Dec 05 01:06:50 crc kubenswrapper[4759]: I1205 01:06:50.182370 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:50 crc kubenswrapper[4759]: I1205 01:06:50.183146 4759 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:50 crc kubenswrapper[4759]: I1205 01:06:50.261189 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:51 crc kubenswrapper[4759]: I1205 01:06:51.034531 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:51 crc kubenswrapper[4759]: I1205 01:06:51.101008 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r9f8v"] Dec 05 01:06:52 crc kubenswrapper[4759]: I1205 01:06:52.981544 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r9f8v" podUID="8a80d917-fc47-4e7e-9aae-b752616833e1" containerName="registry-server" containerID="cri-o://e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af" gracePeriod=2 Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.637023 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.742710 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-catalog-content\") pod \"8a80d917-fc47-4e7e-9aae-b752616833e1\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.742799 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x5sx\" (UniqueName: \"kubernetes.io/projected/8a80d917-fc47-4e7e-9aae-b752616833e1-kube-api-access-5x5sx\") pod \"8a80d917-fc47-4e7e-9aae-b752616833e1\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.742913 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-utilities\") pod \"8a80d917-fc47-4e7e-9aae-b752616833e1\" (UID: \"8a80d917-fc47-4e7e-9aae-b752616833e1\") " Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.743785 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-utilities" (OuterVolumeSpecName: "utilities") pod "8a80d917-fc47-4e7e-9aae-b752616833e1" (UID: "8a80d917-fc47-4e7e-9aae-b752616833e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.752937 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a80d917-fc47-4e7e-9aae-b752616833e1-kube-api-access-5x5sx" (OuterVolumeSpecName: "kube-api-access-5x5sx") pod "8a80d917-fc47-4e7e-9aae-b752616833e1" (UID: "8a80d917-fc47-4e7e-9aae-b752616833e1"). InnerVolumeSpecName "kube-api-access-5x5sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.809927 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a80d917-fc47-4e7e-9aae-b752616833e1" (UID: "8a80d917-fc47-4e7e-9aae-b752616833e1"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.845172 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.845211 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a80d917-fc47-4e7e-9aae-b752616833e1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.845229 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x5sx\" (UniqueName: \"kubernetes.io/projected/8a80d917-fc47-4e7e-9aae-b752616833e1-kube-api-access-5x5sx\") on node \"crc\" DevicePath \"\"" Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.995131 4759 generic.go:334] "Generic (PLEG): container finished" podID="8a80d917-fc47-4e7e-9aae-b752616833e1" containerID="e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af" exitCode=0 Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.995201 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r9f8v" Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.995192 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9f8v" event={"ID":"8a80d917-fc47-4e7e-9aae-b752616833e1","Type":"ContainerDied","Data":"e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af"} Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.995381 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9f8v" event={"ID":"8a80d917-fc47-4e7e-9aae-b752616833e1","Type":"ContainerDied","Data":"ad858fd8582218616a1f9e599c8c15fc961aaf9917a234555095725e1b854de1"} Dec 05 01:06:53 crc kubenswrapper[4759]: I1205 01:06:53.995414 4759 scope.go:117] "RemoveContainer" containerID="e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af" Dec 05 01:06:54 crc kubenswrapper[4759]: I1205 01:06:54.037300 4759 scope.go:117] "RemoveContainer" containerID="51c8c7172d1285cd978a6bd647b46a033dbe2543fe9e6ba4038e1abcea037eee" Dec 05 01:06:54 crc kubenswrapper[4759]: I1205 01:06:54.066051 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r9f8v"] Dec 05 01:06:54 crc kubenswrapper[4759]: I1205 01:06:54.090914 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r9f8v"] Dec 05 01:06:54 crc kubenswrapper[4759]: I1205 01:06:54.119665 4759 scope.go:117] "RemoveContainer" containerID="33b9076a3cfd0bcdb4726d74c6c62ee37e27cf41db7a74760e29f178aae0b6cb" Dec 05 01:06:54 crc kubenswrapper[4759]: I1205 01:06:54.211107 4759 scope.go:117] "RemoveContainer" containerID="e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af" Dec 05 01:06:54 crc kubenswrapper[4759]: E1205 01:06:54.216750 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af\": container with ID starting with e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af not found: ID does not exist" containerID="e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af" Dec 05 01:06:54 crc 
kubenswrapper[4759]: I1205 01:06:54.216810 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af"} err="failed to get container status \"e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af\": rpc error: code = NotFound desc = could not find container \"e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af\": container with ID starting with e0a854f8e69e13dd3c741bd3f466da68e54e313cbd96e473ea6cc11993c0f0af not found: ID does not exist" Dec 05 01:06:54 crc kubenswrapper[4759]: I1205 01:06:54.216838 4759 scope.go:117] "RemoveContainer" containerID="51c8c7172d1285cd978a6bd647b46a033dbe2543fe9e6ba4038e1abcea037eee" Dec 05 01:06:54 crc kubenswrapper[4759]: E1205 01:06:54.221560 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c8c7172d1285cd978a6bd647b46a033dbe2543fe9e6ba4038e1abcea037eee\": container with ID starting with 51c8c7172d1285cd978a6bd647b46a033dbe2543fe9e6ba4038e1abcea037eee not found: ID does not exist" containerID="51c8c7172d1285cd978a6bd647b46a033dbe2543fe9e6ba4038e1abcea037eee" Dec 05 01:06:54 crc kubenswrapper[4759]: I1205 01:06:54.221608 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c8c7172d1285cd978a6bd647b46a033dbe2543fe9e6ba4038e1abcea037eee"} err="failed to get container status \"51c8c7172d1285cd978a6bd647b46a033dbe2543fe9e6ba4038e1abcea037eee\": rpc error: code = NotFound desc = could not find container \"51c8c7172d1285cd978a6bd647b46a033dbe2543fe9e6ba4038e1abcea037eee\": container with ID starting with 51c8c7172d1285cd978a6bd647b46a033dbe2543fe9e6ba4038e1abcea037eee not found: ID does not exist" Dec 05 01:06:54 crc kubenswrapper[4759]: I1205 01:06:54.221632 4759 scope.go:117] "RemoveContainer" containerID="33b9076a3cfd0bcdb4726d74c6c62ee37e27cf41db7a74760e29f178aae0b6cb" Dec 05 01:06:54 crc kubenswrapper[4759]: E1205 01:06:54.222188 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b9076a3cfd0bcdb4726d74c6c62ee37e27cf41db7a74760e29f178aae0b6cb\": container with ID starting with 33b9076a3cfd0bcdb4726d74c6c62ee37e27cf41db7a74760e29f178aae0b6cb not found: ID does not exist" containerID="33b9076a3cfd0bcdb4726d74c6c62ee37e27cf41db7a74760e29f178aae0b6cb" Dec 05 01:06:54 crc kubenswrapper[4759]: I1205 01:06:54.222221 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b9076a3cfd0bcdb4726d74c6c62ee37e27cf41db7a74760e29f178aae0b6cb"} err="failed to get container status \"33b9076a3cfd0bcdb4726d74c6c62ee37e27cf41db7a74760e29f178aae0b6cb\": rpc error: code = NotFound desc = could not find container \"33b9076a3cfd0bcdb4726d74c6c62ee37e27cf41db7a74760e29f178aae0b6cb\": container with ID starting with 33b9076a3cfd0bcdb4726d74c6c62ee37e27cf41db7a74760e29f178aae0b6cb not found: ID does not exist" Dec 05 01:06:55 crc kubenswrapper[4759]: I1205 01:06:55.177014 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a80d917-fc47-4e7e-9aae-b752616833e1" path="/var/lib/kubelet/pods/8a80d917-fc47-4e7e-9aae-b752616833e1/volumes" Dec 05 01:07:04 crc kubenswrapper[4759]: I1205 01:07:04.433495 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:07:04 crc kubenswrapper[4759]: I1205 01:07:04.436387 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:07:34 crc kubenswrapper[4759]: I1205 01:07:34.433953 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:07:34 crc kubenswrapper[4759]: I1205 01:07:34.434625 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:07:58 crc kubenswrapper[4759]: I1205 01:07:58.781726 4759 generic.go:334] "Generic (PLEG): container finished" podID="a691f6a0-4a00-491e-ad04-32f31f8dc175" containerID="f71bffd0bd259e97d4cb2adb2f2b8c5c7f20d86681f72ccec2d14892d6afb141" exitCode=0 Dec 05 01:07:58 crc kubenswrapper[4759]: I1205 01:07:58.781947 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" event={"ID":"a691f6a0-4a00-491e-ad04-32f31f8dc175","Type":"ContainerDied","Data":"f71bffd0bd259e97d4cb2adb2f2b8c5c7f20d86681f72ccec2d14892d6afb141"} Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.330124 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.501619 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-0\") pod \"a691f6a0-4a00-491e-ad04-32f31f8dc175\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.501726 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ssh-key\") pod \"a691f6a0-4a00-491e-ad04-32f31f8dc175\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.501768 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-telemetry-combined-ca-bundle\") pod \"a691f6a0-4a00-491e-ad04-32f31f8dc175\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.501791 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-1\") pod \"a691f6a0-4a00-491e-ad04-32f31f8dc175\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.501817 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp2mr\" (UniqueName: \"kubernetes.io/projected/a691f6a0-4a00-491e-ad04-32f31f8dc175-kube-api-access-cp2mr\") pod \"a691f6a0-4a00-491e-ad04-32f31f8dc175\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.501977 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-inventory\") pod \"a691f6a0-4a00-491e-ad04-32f31f8dc175\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.502017 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-2\") pod \"a691f6a0-4a00-491e-ad04-32f31f8dc175\" (UID: \"a691f6a0-4a00-491e-ad04-32f31f8dc175\") " Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.507093 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a691f6a0-4a00-491e-ad04-32f31f8dc175" (UID: "a691f6a0-4a00-491e-ad04-32f31f8dc175"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.514702 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a691f6a0-4a00-491e-ad04-32f31f8dc175-kube-api-access-cp2mr" (OuterVolumeSpecName: "kube-api-access-cp2mr") pod "a691f6a0-4a00-491e-ad04-32f31f8dc175" (UID: "a691f6a0-4a00-491e-ad04-32f31f8dc175"). 
InnerVolumeSpecName "kube-api-access-cp2mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.542800 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-inventory" (OuterVolumeSpecName: "inventory") pod "a691f6a0-4a00-491e-ad04-32f31f8dc175" (UID: "a691f6a0-4a00-491e-ad04-32f31f8dc175"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.542907 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a691f6a0-4a00-491e-ad04-32f31f8dc175" (UID: "a691f6a0-4a00-491e-ad04-32f31f8dc175"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.545197 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a691f6a0-4a00-491e-ad04-32f31f8dc175" (UID: "a691f6a0-4a00-491e-ad04-32f31f8dc175"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.553719 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a691f6a0-4a00-491e-ad04-32f31f8dc175" (UID: "a691f6a0-4a00-491e-ad04-32f31f8dc175"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.568370 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a691f6a0-4a00-491e-ad04-32f31f8dc175" (UID: "a691f6a0-4a00-491e-ad04-32f31f8dc175"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.604367 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.604408 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.604422 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.604436 4759 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.604452 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp2mr\" (UniqueName: \"kubernetes.io/projected/a691f6a0-4a00-491e-ad04-32f31f8dc175-kube-api-access-cp2mr\") on node \"crc\" DevicePath \"\"" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.604468 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.604479 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a691f6a0-4a00-491e-ad04-32f31f8dc175-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.805197 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" event={"ID":"a691f6a0-4a00-491e-ad04-32f31f8dc175","Type":"ContainerDied","Data":"54bd6985dacd859679e03723b5729be84b627e8354e56b9100f0e7a7ec21a1d1"} Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.805237 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54bd6985dacd859679e03723b5729be84b627e8354e56b9100f0e7a7ec21a1d1" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.805253 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.938960 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh"] Dec 05 01:08:00 crc kubenswrapper[4759]: E1205 01:08:00.939790 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a80d917-fc47-4e7e-9aae-b752616833e1" containerName="extract-utilities" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.939809 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a80d917-fc47-4e7e-9aae-b752616833e1" containerName="extract-utilities" Dec 05 01:08:00 crc kubenswrapper[4759]: E1205 01:08:00.939831 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a691f6a0-4a00-491e-ad04-32f31f8dc175" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.939838 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a691f6a0-4a00-491e-ad04-32f31f8dc175" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 01:08:00 crc kubenswrapper[4759]: E1205 01:08:00.939860 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a80d917-fc47-4e7e-9aae-b752616833e1" containerName="extract-content" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.939865 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a80d917-fc47-4e7e-9aae-b752616833e1" containerName="extract-content" Dec 05 01:08:00 crc kubenswrapper[4759]: E1205 01:08:00.939909 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a80d917-fc47-4e7e-9aae-b752616833e1" containerName="registry-server" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.939935 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a80d917-fc47-4e7e-9aae-b752616833e1" containerName="registry-server" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.940131 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a80d917-fc47-4e7e-9aae-b752616833e1" containerName="registry-server" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.940144 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="a691f6a0-4a00-491e-ad04-32f31f8dc175" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.940935 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.948866 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.949028 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.949183 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.949293 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.949416 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:08:00 crc kubenswrapper[4759]: I1205 01:08:00.951348 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh"] Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.014238 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.014336 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.014367 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.014572 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.014594 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: 
\"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.014610 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzknb\" (UniqueName: \"kubernetes.io/projected/bb51317e-dc76-4d08-af63-d4719ab711d9-kube-api-access-rzknb\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.014629 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.116707 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.117097 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.117123 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzknb\" (UniqueName: \"kubernetes.io/projected/bb51317e-dc76-4d08-af63-d4719ab711d9-kube-api-access-rzknb\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.117153 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.117250 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.117290 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.117337 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.123577 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.123750 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.123745 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.126869 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.148866 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.149331 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.154376 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.155816 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzknb\" (UniqueName: 
\"kubernetes.io/projected/bb51317e-dc76-4d08-af63-d4719ab711d9-kube-api-access-rzknb\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.157815 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.158719 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.283413 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.291545 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:08:01 crc kubenswrapper[4759]: I1205 01:08:01.878702 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh"] Dec 05 01:08:02 crc kubenswrapper[4759]: I1205 01:08:02.436744 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:08:02 crc kubenswrapper[4759]: I1205 01:08:02.828742 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" event={"ID":"bb51317e-dc76-4d08-af63-d4719ab711d9","Type":"ContainerStarted","Data":"9203bed4864fb03e0cb1be01cc327b1b9e6cf1e55c83a28b6aa9d283ea03596c"} Dec 05 01:08:02 crc kubenswrapper[4759]: I1205 01:08:02.829072 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" event={"ID":"bb51317e-dc76-4d08-af63-d4719ab711d9","Type":"ContainerStarted","Data":"624ebb2dd2c749a12da31a11c9ead31a01d53c8e5b3fb0bae3a8c2de5c796e53"} Dec 05 01:08:02 crc kubenswrapper[4759]: I1205 01:08:02.866051 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" podStartSLOduration=2.311059184 podStartE2EDuration="2.866032274s" podCreationTimestamp="2025-12-05 01:08:00 +0000 UTC" firstStartedPulling="2025-12-05 01:08:01.879461604 +0000 UTC m=+2701.095122554" lastFinishedPulling="2025-12-05 01:08:02.434434694 +0000 UTC m=+2701.650095644" observedRunningTime="2025-12-05 01:08:02.854741608 +0000 UTC m=+2702.070402578" watchObservedRunningTime="2025-12-05 01:08:02.866032274 +0000 UTC m=+2702.081693224" Dec 05 01:08:04 crc kubenswrapper[4759]: I1205 01:08:04.432960 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:08:04 crc kubenswrapper[4759]: I1205 01:08:04.433589 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:08:04 crc kubenswrapper[4759]: I1205 01:08:04.433754 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 01:08:04 crc kubenswrapper[4759]: I1205 01:08:04.435897 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"006ff0659038a9a56fe5eca39d856933d9115b2d975aabcc06e2ec6eb00ac4e8"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 01:08:04 crc kubenswrapper[4759]: I1205 01:08:04.436116 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://006ff0659038a9a56fe5eca39d856933d9115b2d975aabcc06e2ec6eb00ac4e8" gracePeriod=600 Dec 05 01:08:04 crc kubenswrapper[4759]: E1205 01:08:04.657251 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod879c79ed_3fea_4896_84a5_e3c44d13a0c6.slice/crio-006ff0659038a9a56fe5eca39d856933d9115b2d975aabcc06e2ec6eb00ac4e8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod879c79ed_3fea_4896_84a5_e3c44d13a0c6.slice/crio-conmon-006ff0659038a9a56fe5eca39d856933d9115b2d975aabcc06e2ec6eb00ac4e8.scope\": RecentStats: unable to find data in memory cache]" Dec 05 01:08:04 crc kubenswrapper[4759]: I1205 01:08:04.855271 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="006ff0659038a9a56fe5eca39d856933d9115b2d975aabcc06e2ec6eb00ac4e8" exitCode=0 Dec 05 01:08:04 crc kubenswrapper[4759]: I1205 01:08:04.855340 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"006ff0659038a9a56fe5eca39d856933d9115b2d975aabcc06e2ec6eb00ac4e8"} Dec 05 01:08:04 crc kubenswrapper[4759]: I1205 01:08:04.855714 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"} Dec 05 01:08:04 crc kubenswrapper[4759]: I1205 01:08:04.855746 4759 scope.go:117] "RemoveContainer" containerID="c9b652356f4fb7488e9d614c8b67098e2820ac5e96840557222ff158402bbe03" Dec 05 01:09:41 crc kubenswrapper[4759]: I1205 01:09:41.924911 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hgctw"] Dec 05 01:09:41 crc kubenswrapper[4759]: I1205 
01:09:41.929715 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:41 crc kubenswrapper[4759]: I1205 01:09:41.946222 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgctw"] Dec 05 01:09:42 crc kubenswrapper[4759]: I1205 01:09:42.054002 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5tvz\" (UniqueName: \"kubernetes.io/projected/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-kube-api-access-m5tvz\") pod \"community-operators-hgctw\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:42 crc kubenswrapper[4759]: I1205 01:09:42.054761 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-catalog-content\") pod \"community-operators-hgctw\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:42 crc kubenswrapper[4759]: I1205 01:09:42.055031 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-utilities\") pod \"community-operators-hgctw\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:42 crc kubenswrapper[4759]: I1205 01:09:42.156748 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5tvz\" (UniqueName: \"kubernetes.io/projected/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-kube-api-access-m5tvz\") pod \"community-operators-hgctw\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:42 crc kubenswrapper[4759]: I1205 01:09:42.157279 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-catalog-content\") pod \"community-operators-hgctw\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:42 crc kubenswrapper[4759]: I1205 01:09:42.157510 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-utilities\") pod \"community-operators-hgctw\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:42 crc kubenswrapper[4759]: I1205 01:09:42.158334 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-utilities\") pod \"community-operators-hgctw\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:42 crc kubenswrapper[4759]: I1205 01:09:42.158348 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-catalog-content\") pod \"community-operators-hgctw\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:42 crc 
kubenswrapper[4759]: I1205 01:09:42.188098 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5tvz\" (UniqueName: \"kubernetes.io/projected/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-kube-api-access-m5tvz\") pod \"community-operators-hgctw\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:42 crc kubenswrapper[4759]: I1205 01:09:42.279803 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:42 crc kubenswrapper[4759]: I1205 01:09:42.831923 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgctw"] Dec 05 01:09:43 crc kubenswrapper[4759]: I1205 01:09:43.235803 4759 generic.go:334] "Generic (PLEG): container finished" podID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" containerID="1efbbce6a801bf39e226fd37128a859259c4925aa5e2aed9fa242d64e4af79be" exitCode=0 Dec 05 01:09:43 crc kubenswrapper[4759]: I1205 01:09:43.235891 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgctw" event={"ID":"36fc0d18-fe88-4a14-ad1c-42a7fc66377d","Type":"ContainerDied","Data":"1efbbce6a801bf39e226fd37128a859259c4925aa5e2aed9fa242d64e4af79be"} Dec 05 01:09:43 crc kubenswrapper[4759]: I1205 01:09:43.237269 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgctw" event={"ID":"36fc0d18-fe88-4a14-ad1c-42a7fc66377d","Type":"ContainerStarted","Data":"bbddb312c9da9703521d100f259e4c779e506c385e05d2eec0196b154ebcf767"} Dec 05 01:09:44 crc kubenswrapper[4759]: I1205 01:09:44.249666 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgctw" event={"ID":"36fc0d18-fe88-4a14-ad1c-42a7fc66377d","Type":"ContainerStarted","Data":"ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11"} Dec 05 01:09:45 crc kubenswrapper[4759]: I1205 01:09:45.276713 4759 generic.go:334] "Generic (PLEG): container finished" podID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" containerID="ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11" exitCode=0 Dec 05 01:09:45 crc kubenswrapper[4759]: I1205 01:09:45.276787 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgctw" event={"ID":"36fc0d18-fe88-4a14-ad1c-42a7fc66377d","Type":"ContainerDied","Data":"ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11"} Dec 05 01:09:46 crc kubenswrapper[4759]: I1205 01:09:46.309664 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgctw" event={"ID":"36fc0d18-fe88-4a14-ad1c-42a7fc66377d","Type":"ContainerStarted","Data":"de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b"} Dec 05 01:09:46 crc kubenswrapper[4759]: I1205 01:09:46.339839 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hgctw" podStartSLOduration=2.875618516 podStartE2EDuration="5.339813374s" podCreationTimestamp="2025-12-05 01:09:41 +0000 UTC" firstStartedPulling="2025-12-05 01:09:43.238380037 +0000 UTC m=+2802.454041037" lastFinishedPulling="2025-12-05 01:09:45.702574905 +0000 UTC m=+2804.918235895" observedRunningTime="2025-12-05 01:09:46.330991968 +0000 UTC m=+2805.546652998" watchObservedRunningTime="2025-12-05 01:09:46.339813374 +0000 UTC m=+2805.555474354" Dec 05 01:09:52 crc kubenswrapper[4759]: I1205 
01:09:52.280899 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:52 crc kubenswrapper[4759]: I1205 01:09:52.281429 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:52 crc kubenswrapper[4759]: I1205 01:09:52.346943 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:52 crc kubenswrapper[4759]: I1205 01:09:52.436901 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:52 crc kubenswrapper[4759]: I1205 01:09:52.595063 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgctw"] Dec 05 01:09:54 crc kubenswrapper[4759]: I1205 01:09:54.409280 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hgctw" podUID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" containerName="registry-server" containerID="cri-o://de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b" gracePeriod=2 Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.067056 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.191421 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5tvz\" (UniqueName: \"kubernetes.io/projected/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-kube-api-access-m5tvz\") pod \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.191586 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-utilities\") pod \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.191669 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-catalog-content\") pod \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\" (UID: \"36fc0d18-fe88-4a14-ad1c-42a7fc66377d\") " Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.192774 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-utilities" (OuterVolumeSpecName: "utilities") pod "36fc0d18-fe88-4a14-ad1c-42a7fc66377d" (UID: "36fc0d18-fe88-4a14-ad1c-42a7fc66377d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.199695 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-kube-api-access-m5tvz" (OuterVolumeSpecName: "kube-api-access-m5tvz") pod "36fc0d18-fe88-4a14-ad1c-42a7fc66377d" (UID: "36fc0d18-fe88-4a14-ad1c-42a7fc66377d"). InnerVolumeSpecName "kube-api-access-m5tvz". 
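As with certified-operators-r9f8v earlier, the "SyncLoop (probe)" lines above appear only when a probe's cached result changes (startup: unhealthy -> started; readiness: "" -> ready), not on every probe period. A small model of that change-only reporting, with illustrative types rather than the kubelet's prober internals:

```go
package main

import "fmt"

type resultCache map[string]string // probe key -> last reported status

// set stores a result and reports whether it changed.
func (c resultCache) set(key, status string) bool {
	if c[key] == status {
		return false
	}
	c[key] = status
	return true
}

func main() {
	cache := resultCache{}
	updates := []struct{ key, status string }{
		{"community-operators-hgctw/startup", "unhealthy"},
		{"community-operators-hgctw/startup", "started"},
		{"community-operators-hgctw/readiness", "ready"},
		{"community-operators-hgctw/readiness", "ready"}, // unchanged: nothing logged
	}
	for _, u := range updates {
		if cache.set(u.key, u.status) {
			fmt.Printf("SyncLoop (probe) %s status=%q\n", u.key, u.status)
		}
	}
}
```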
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.279126 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36fc0d18-fe88-4a14-ad1c-42a7fc66377d" (UID: "36fc0d18-fe88-4a14-ad1c-42a7fc66377d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.295017 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.295056 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.295071 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5tvz\" (UniqueName: \"kubernetes.io/projected/36fc0d18-fe88-4a14-ad1c-42a7fc66377d-kube-api-access-m5tvz\") on node \"crc\" DevicePath \"\"" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.426821 4759 generic.go:334] "Generic (PLEG): container finished" podID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" containerID="de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b" exitCode=0 Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.426874 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgctw" event={"ID":"36fc0d18-fe88-4a14-ad1c-42a7fc66377d","Type":"ContainerDied","Data":"de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b"} Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.426970 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgctw" event={"ID":"36fc0d18-fe88-4a14-ad1c-42a7fc66377d","Type":"ContainerDied","Data":"bbddb312c9da9703521d100f259e4c779e506c385e05d2eec0196b154ebcf767"} Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.426939 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hgctw" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.427009 4759 scope.go:117] "RemoveContainer" containerID="de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.481524 4759 scope.go:117] "RemoveContainer" containerID="ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.501476 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgctw"] Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.514035 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hgctw"] Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.519678 4759 scope.go:117] "RemoveContainer" containerID="1efbbce6a801bf39e226fd37128a859259c4925aa5e2aed9fa242d64e4af79be" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.556116 4759 scope.go:117] "RemoveContainer" containerID="de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b" Dec 05 01:09:55 crc kubenswrapper[4759]: E1205 01:09:55.556743 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b\": container with ID starting with de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b not found: ID does not exist" containerID="de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.556796 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b"} err="failed to get container status \"de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b\": rpc error: code = NotFound desc = could not find container \"de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b\": container with ID starting with de5f9a13fd7a8c5ba20c008da67a910e4f785212618a5b75ea7c31588b66b93b not found: ID does not exist" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.556826 4759 scope.go:117] "RemoveContainer" containerID="ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11" Dec 05 01:09:55 crc kubenswrapper[4759]: E1205 01:09:55.557132 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11\": container with ID starting with ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11 not found: ID does not exist" containerID="ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.557172 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11"} err="failed to get container status \"ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11\": rpc error: code = NotFound desc = could not find container \"ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11\": container with ID starting with ca44b5a2e70248e6d9d495d8a620f944413eb0043af23ed264850c551933bf11 not found: ID does not exist" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.557221 4759 scope.go:117] "RemoveContainer" 
containerID="1efbbce6a801bf39e226fd37128a859259c4925aa5e2aed9fa242d64e4af79be" Dec 05 01:09:55 crc kubenswrapper[4759]: E1205 01:09:55.557582 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1efbbce6a801bf39e226fd37128a859259c4925aa5e2aed9fa242d64e4af79be\": container with ID starting with 1efbbce6a801bf39e226fd37128a859259c4925aa5e2aed9fa242d64e4af79be not found: ID does not exist" containerID="1efbbce6a801bf39e226fd37128a859259c4925aa5e2aed9fa242d64e4af79be" Dec 05 01:09:55 crc kubenswrapper[4759]: I1205 01:09:55.557617 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1efbbce6a801bf39e226fd37128a859259c4925aa5e2aed9fa242d64e4af79be"} err="failed to get container status \"1efbbce6a801bf39e226fd37128a859259c4925aa5e2aed9fa242d64e4af79be\": rpc error: code = NotFound desc = could not find container \"1efbbce6a801bf39e226fd37128a859259c4925aa5e2aed9fa242d64e4af79be\": container with ID starting with 1efbbce6a801bf39e226fd37128a859259c4925aa5e2aed9fa242d64e4af79be not found: ID does not exist" Dec 05 01:09:57 crc kubenswrapper[4759]: I1205 01:09:57.171785 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" path="/var/lib/kubelet/pods/36fc0d18-fe88-4a14-ad1c-42a7fc66377d/volumes" Dec 05 01:10:04 crc kubenswrapper[4759]: I1205 01:10:04.434096 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:10:04 crc kubenswrapper[4759]: I1205 01:10:04.434895 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:10:26 crc kubenswrapper[4759]: I1205 01:10:26.857467 4759 generic.go:334] "Generic (PLEG): container finished" podID="bb51317e-dc76-4d08-af63-d4719ab711d9" containerID="9203bed4864fb03e0cb1be01cc327b1b9e6cf1e55c83a28b6aa9d283ea03596c" exitCode=0 Dec 05 01:10:26 crc kubenswrapper[4759]: I1205 01:10:26.857575 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" event={"ID":"bb51317e-dc76-4d08-af63-d4719ab711d9","Type":"ContainerDied","Data":"9203bed4864fb03e0cb1be01cc327b1b9e6cf1e55c83a28b6aa9d283ea03596c"} Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.345583 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.464337 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ssh-key\") pod \"bb51317e-dc76-4d08-af63-d4719ab711d9\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.464716 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzknb\" (UniqueName: \"kubernetes.io/projected/bb51317e-dc76-4d08-af63-d4719ab711d9-kube-api-access-rzknb\") pod \"bb51317e-dc76-4d08-af63-d4719ab711d9\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.464946 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-telemetry-power-monitoring-combined-ca-bundle\") pod \"bb51317e-dc76-4d08-af63-d4719ab711d9\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.464988 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-0\") pod \"bb51317e-dc76-4d08-af63-d4719ab711d9\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.465043 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-2\") pod \"bb51317e-dc76-4d08-af63-d4719ab711d9\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.465123 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-inventory\") pod \"bb51317e-dc76-4d08-af63-d4719ab711d9\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.465181 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-1\") pod \"bb51317e-dc76-4d08-af63-d4719ab711d9\" (UID: \"bb51317e-dc76-4d08-af63-d4719ab711d9\") " Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.480560 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "bb51317e-dc76-4d08-af63-d4719ab711d9" (UID: "bb51317e-dc76-4d08-af63-d4719ab711d9"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.483508 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb51317e-dc76-4d08-af63-d4719ab711d9-kube-api-access-rzknb" (OuterVolumeSpecName: "kube-api-access-rzknb") pod "bb51317e-dc76-4d08-af63-d4719ab711d9" (UID: "bb51317e-dc76-4d08-af63-d4719ab711d9"). InnerVolumeSpecName "kube-api-access-rzknb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.496479 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bb51317e-dc76-4d08-af63-d4719ab711d9" (UID: "bb51317e-dc76-4d08-af63-d4719ab711d9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.498741 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "bb51317e-dc76-4d08-af63-d4719ab711d9" (UID: "bb51317e-dc76-4d08-af63-d4719ab711d9"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.503107 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "bb51317e-dc76-4d08-af63-d4719ab711d9" (UID: "bb51317e-dc76-4d08-af63-d4719ab711d9"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.506505 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-inventory" (OuterVolumeSpecName: "inventory") pod "bb51317e-dc76-4d08-af63-d4719ab711d9" (UID: "bb51317e-dc76-4d08-af63-d4719ab711d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.507363 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "bb51317e-dc76-4d08-af63-d4719ab711d9" (UID: "bb51317e-dc76-4d08-af63-d4719ab711d9"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.567868 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.567899 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.567914 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.567927 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzknb\" (UniqueName: \"kubernetes.io/projected/bb51317e-dc76-4d08-af63-d4719ab711d9-kube-api-access-rzknb\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.567939 4759 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.567953 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.567968 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb51317e-dc76-4d08-af63-d4719ab711d9-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.877536 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" event={"ID":"bb51317e-dc76-4d08-af63-d4719ab711d9","Type":"ContainerDied","Data":"624ebb2dd2c749a12da31a11c9ead31a01d53c8e5b3fb0bae3a8c2de5c796e53"} Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.877607 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="624ebb2dd2c749a12da31a11c9ead31a01d53c8e5b3fb0bae3a8c2de5c796e53" Dec 05 01:10:28 crc kubenswrapper[4759]: I1205 01:10:28.877627 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.016767 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm"] Dec 05 01:10:29 crc kubenswrapper[4759]: E1205 01:10:29.017345 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" containerName="extract-content" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.017366 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" containerName="extract-content" Dec 05 01:10:29 crc kubenswrapper[4759]: E1205 01:10:29.017408 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" containerName="registry-server" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.017417 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" containerName="registry-server" Dec 05 01:10:29 crc kubenswrapper[4759]: E1205 01:10:29.017440 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" containerName="extract-utilities" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.017449 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" containerName="extract-utilities" Dec 05 01:10:29 crc kubenswrapper[4759]: E1205 01:10:29.017476 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb51317e-dc76-4d08-af63-d4719ab711d9" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.017487 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb51317e-dc76-4d08-af63-d4719ab711d9" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.017732 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb51317e-dc76-4d08-af63-d4719ab711d9" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.017775 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="36fc0d18-fe88-4a14-ad1c-42a7fc66377d" containerName="registry-server" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.018716 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.020864 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.021525 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.021794 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.021973 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.022065 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.026080 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm"] Dec 05 01:10:29 crc kubenswrapper[4759]: E1205 01:10:29.137592 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb51317e_dc76_4d08_af63_d4719ab711d9.slice/crio-624ebb2dd2c749a12da31a11c9ead31a01d53c8e5b3fb0bae3a8c2de5c796e53\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb51317e_dc76_4d08_af63_d4719ab711d9.slice\": RecentStats: unable to find data in memory cache]" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.179986 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.180049 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.180121 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.180196 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px4qs\" (UniqueName: \"kubernetes.io/projected/df9ccd57-fa3c-429c-a23c-306bf24a4515-kube-api-access-px4qs\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" 
Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.180573 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.282302 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.282596 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.282622 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.282664 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.282712 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px4qs\" (UniqueName: \"kubernetes.io/projected/df9ccd57-fa3c-429c-a23c-306bf24a4515-kube-api-access-px4qs\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.289658 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.290024 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.291541 4759 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.300096 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.306586 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px4qs\" (UniqueName: \"kubernetes.io/projected/df9ccd57-fa3c-429c-a23c-306bf24a4515-kube-api-access-px4qs\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xx2hm\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.344205 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:29 crc kubenswrapper[4759]: I1205 01:10:29.957009 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm"] Dec 05 01:10:30 crc kubenswrapper[4759]: I1205 01:10:30.903526 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" event={"ID":"df9ccd57-fa3c-429c-a23c-306bf24a4515","Type":"ContainerStarted","Data":"e9c34d65eec1b4d7620a0b28265fd37acc6247ba0f45181b3f795e56b8a489be"} Dec 05 01:10:30 crc kubenswrapper[4759]: I1205 01:10:30.903953 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" event={"ID":"df9ccd57-fa3c-429c-a23c-306bf24a4515","Type":"ContainerStarted","Data":"d2c27e63c02d7e01977e1530385b3c846eed9a1a961cf524c16cdefa73116632"} Dec 05 01:10:30 crc kubenswrapper[4759]: I1205 01:10:30.926095 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" podStartSLOduration=2.418054983 podStartE2EDuration="2.926075194s" podCreationTimestamp="2025-12-05 01:10:28 +0000 UTC" firstStartedPulling="2025-12-05 01:10:29.962832756 +0000 UTC m=+2849.178493746" lastFinishedPulling="2025-12-05 01:10:30.470852997 +0000 UTC m=+2849.686513957" observedRunningTime="2025-12-05 01:10:30.924697601 +0000 UTC m=+2850.140358571" watchObservedRunningTime="2025-12-05 01:10:30.926075194 +0000 UTC m=+2850.141736164" Dec 05 01:10:34 crc kubenswrapper[4759]: I1205 01:10:34.433357 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:10:34 crc kubenswrapper[4759]: I1205 01:10:34.434119 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" 
podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.053550 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lbl8v"] Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.056478 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.083421 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbl8v"] Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.153141 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-utilities\") pod \"redhat-marketplace-lbl8v\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.153181 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-catalog-content\") pod \"redhat-marketplace-lbl8v\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.153427 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdvpp\" (UniqueName: \"kubernetes.io/projected/22be5575-57cd-4595-b3ad-b94ce3a00eef-kube-api-access-vdvpp\") pod \"redhat-marketplace-lbl8v\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.255100 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdvpp\" (UniqueName: \"kubernetes.io/projected/22be5575-57cd-4595-b3ad-b94ce3a00eef-kube-api-access-vdvpp\") pod \"redhat-marketplace-lbl8v\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.255285 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-catalog-content\") pod \"redhat-marketplace-lbl8v\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.255334 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-utilities\") pod \"redhat-marketplace-lbl8v\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.255828 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-catalog-content\") pod \"redhat-marketplace-lbl8v\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " pod="openshift-marketplace/redhat-marketplace-lbl8v" 
Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.255909 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-utilities\") pod \"redhat-marketplace-lbl8v\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.277453 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdvpp\" (UniqueName: \"kubernetes.io/projected/22be5575-57cd-4595-b3ad-b94ce3a00eef-kube-api-access-vdvpp\") pod \"redhat-marketplace-lbl8v\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.394508 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:45 crc kubenswrapper[4759]: I1205 01:10:45.866639 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbl8v"] Dec 05 01:10:46 crc kubenswrapper[4759]: I1205 01:10:46.095209 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbl8v" event={"ID":"22be5575-57cd-4595-b3ad-b94ce3a00eef","Type":"ContainerStarted","Data":"190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348"} Dec 05 01:10:46 crc kubenswrapper[4759]: I1205 01:10:46.095247 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbl8v" event={"ID":"22be5575-57cd-4595-b3ad-b94ce3a00eef","Type":"ContainerStarted","Data":"35b3f7b729bfb85efb54b546b1ad80ca650536769fe593fd0365f170eb282010"} Dec 05 01:10:47 crc kubenswrapper[4759]: I1205 01:10:47.107715 4759 generic.go:334] "Generic (PLEG): container finished" podID="22be5575-57cd-4595-b3ad-b94ce3a00eef" containerID="190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348" exitCode=0 Dec 05 01:10:47 crc kubenswrapper[4759]: I1205 01:10:47.107872 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbl8v" event={"ID":"22be5575-57cd-4595-b3ad-b94ce3a00eef","Type":"ContainerDied","Data":"190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348"} Dec 05 01:10:49 crc kubenswrapper[4759]: I1205 01:10:49.148808 4759 generic.go:334] "Generic (PLEG): container finished" podID="22be5575-57cd-4595-b3ad-b94ce3a00eef" containerID="d78de95a3ed6e35278c6168f9a71c80ad33f2f00145a889c65073f536f39c41b" exitCode=0 Dec 05 01:10:49 crc kubenswrapper[4759]: I1205 01:10:49.148862 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbl8v" event={"ID":"22be5575-57cd-4595-b3ad-b94ce3a00eef","Type":"ContainerDied","Data":"d78de95a3ed6e35278c6168f9a71c80ad33f2f00145a889c65073f536f39c41b"} Dec 05 01:10:50 crc kubenswrapper[4759]: I1205 01:10:50.166252 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbl8v" event={"ID":"22be5575-57cd-4595-b3ad-b94ce3a00eef","Type":"ContainerStarted","Data":"6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb"} Dec 05 01:10:50 crc kubenswrapper[4759]: I1205 01:10:50.197705 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lbl8v" podStartSLOduration=2.742078584 podStartE2EDuration="5.197682042s" podCreationTimestamp="2025-12-05 
01:10:45 +0000 UTC" firstStartedPulling="2025-12-05 01:10:47.111039147 +0000 UTC m=+2866.326700097" lastFinishedPulling="2025-12-05 01:10:49.566642605 +0000 UTC m=+2868.782303555" observedRunningTime="2025-12-05 01:10:50.189846849 +0000 UTC m=+2869.405507819" watchObservedRunningTime="2025-12-05 01:10:50.197682042 +0000 UTC m=+2869.413343012" Dec 05 01:10:52 crc kubenswrapper[4759]: I1205 01:10:52.220952 4759 generic.go:334] "Generic (PLEG): container finished" podID="df9ccd57-fa3c-429c-a23c-306bf24a4515" containerID="e9c34d65eec1b4d7620a0b28265fd37acc6247ba0f45181b3f795e56b8a489be" exitCode=0 Dec 05 01:10:52 crc kubenswrapper[4759]: I1205 01:10:52.221022 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" event={"ID":"df9ccd57-fa3c-429c-a23c-306bf24a4515","Type":"ContainerDied","Data":"e9c34d65eec1b4d7620a0b28265fd37acc6247ba0f45181b3f795e56b8a489be"} Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.732373 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.864782 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px4qs\" (UniqueName: \"kubernetes.io/projected/df9ccd57-fa3c-429c-a23c-306bf24a4515-kube-api-access-px4qs\") pod \"df9ccd57-fa3c-429c-a23c-306bf24a4515\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.864893 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-0\") pod \"df9ccd57-fa3c-429c-a23c-306bf24a4515\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.864936 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-inventory\") pod \"df9ccd57-fa3c-429c-a23c-306bf24a4515\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.865065 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-ssh-key\") pod \"df9ccd57-fa3c-429c-a23c-306bf24a4515\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.865102 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-1\") pod \"df9ccd57-fa3c-429c-a23c-306bf24a4515\" (UID: \"df9ccd57-fa3c-429c-a23c-306bf24a4515\") " Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.872571 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df9ccd57-fa3c-429c-a23c-306bf24a4515-kube-api-access-px4qs" (OuterVolumeSpecName: "kube-api-access-px4qs") pod "df9ccd57-fa3c-429c-a23c-306bf24a4515" (UID: "df9ccd57-fa3c-429c-a23c-306bf24a4515"). InnerVolumeSpecName "kube-api-access-px4qs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.902283 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "df9ccd57-fa3c-429c-a23c-306bf24a4515" (UID: "df9ccd57-fa3c-429c-a23c-306bf24a4515"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.907121 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "df9ccd57-fa3c-429c-a23c-306bf24a4515" (UID: "df9ccd57-fa3c-429c-a23c-306bf24a4515"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.926464 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-inventory" (OuterVolumeSpecName: "inventory") pod "df9ccd57-fa3c-429c-a23c-306bf24a4515" (UID: "df9ccd57-fa3c-429c-a23c-306bf24a4515"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.927878 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "df9ccd57-fa3c-429c-a23c-306bf24a4515" (UID: "df9ccd57-fa3c-429c-a23c-306bf24a4515"). InnerVolumeSpecName "logging-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.968114 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.968208 4759 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.968243 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px4qs\" (UniqueName: \"kubernetes.io/projected/df9ccd57-fa3c-429c-a23c-306bf24a4515-kube-api-access-px4qs\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.968273 4759 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:53 crc kubenswrapper[4759]: I1205 01:10:53.968337 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df9ccd57-fa3c-429c-a23c-306bf24a4515-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:54 crc kubenswrapper[4759]: I1205 01:10:54.255651 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" event={"ID":"df9ccd57-fa3c-429c-a23c-306bf24a4515","Type":"ContainerDied","Data":"d2c27e63c02d7e01977e1530385b3c846eed9a1a961cf524c16cdefa73116632"} Dec 05 01:10:54 crc kubenswrapper[4759]: I1205 01:10:54.255738 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2c27e63c02d7e01977e1530385b3c846eed9a1a961cf524c16cdefa73116632" Dec 05 01:10:54 crc kubenswrapper[4759]: I1205 01:10:54.255813 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm" Dec 05 01:10:55 crc kubenswrapper[4759]: I1205 01:10:55.395609 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:55 crc kubenswrapper[4759]: I1205 01:10:55.395666 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:55 crc kubenswrapper[4759]: I1205 01:10:55.449496 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:56 crc kubenswrapper[4759]: I1205 01:10:56.354958 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:56 crc kubenswrapper[4759]: I1205 01:10:56.411149 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbl8v"] Dec 05 01:10:58 crc kubenswrapper[4759]: I1205 01:10:58.305514 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lbl8v" podUID="22be5575-57cd-4595-b3ad-b94ce3a00eef" containerName="registry-server" containerID="cri-o://6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb" gracePeriod=2 Dec 05 01:10:58 crc kubenswrapper[4759]: I1205 01:10:58.785879 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:58 crc kubenswrapper[4759]: I1205 01:10:58.878315 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-utilities\") pod \"22be5575-57cd-4595-b3ad-b94ce3a00eef\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " Dec 05 01:10:58 crc kubenswrapper[4759]: I1205 01:10:58.879460 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-catalog-content\") pod \"22be5575-57cd-4595-b3ad-b94ce3a00eef\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " Dec 05 01:10:58 crc kubenswrapper[4759]: I1205 01:10:58.879393 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-utilities" (OuterVolumeSpecName: "utilities") pod "22be5575-57cd-4595-b3ad-b94ce3a00eef" (UID: "22be5575-57cd-4595-b3ad-b94ce3a00eef"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:10:58 crc kubenswrapper[4759]: I1205 01:10:58.879527 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdvpp\" (UniqueName: \"kubernetes.io/projected/22be5575-57cd-4595-b3ad-b94ce3a00eef-kube-api-access-vdvpp\") pod \"22be5575-57cd-4595-b3ad-b94ce3a00eef\" (UID: \"22be5575-57cd-4595-b3ad-b94ce3a00eef\") " Dec 05 01:10:58 crc kubenswrapper[4759]: I1205 01:10:58.884002 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:58 crc kubenswrapper[4759]: I1205 01:10:58.887207 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22be5575-57cd-4595-b3ad-b94ce3a00eef-kube-api-access-vdvpp" (OuterVolumeSpecName: "kube-api-access-vdvpp") pod "22be5575-57cd-4595-b3ad-b94ce3a00eef" (UID: "22be5575-57cd-4595-b3ad-b94ce3a00eef"). InnerVolumeSpecName "kube-api-access-vdvpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:10:58 crc kubenswrapper[4759]: I1205 01:10:58.913396 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22be5575-57cd-4595-b3ad-b94ce3a00eef" (UID: "22be5575-57cd-4595-b3ad-b94ce3a00eef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:10:58 crc kubenswrapper[4759]: I1205 01:10:58.985815 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22be5575-57cd-4595-b3ad-b94ce3a00eef-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:58 crc kubenswrapper[4759]: I1205 01:10:58.985842 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdvpp\" (UniqueName: \"kubernetes.io/projected/22be5575-57cd-4595-b3ad-b94ce3a00eef-kube-api-access-vdvpp\") on node \"crc\" DevicePath \"\"" Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.317883 4759 generic.go:334] "Generic (PLEG): container finished" podID="22be5575-57cd-4595-b3ad-b94ce3a00eef" containerID="6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb" exitCode=0 Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.317935 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbl8v" event={"ID":"22be5575-57cd-4595-b3ad-b94ce3a00eef","Type":"ContainerDied","Data":"6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb"} Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.317965 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbl8v" event={"ID":"22be5575-57cd-4595-b3ad-b94ce3a00eef","Type":"ContainerDied","Data":"35b3f7b729bfb85efb54b546b1ad80ca650536769fe593fd0365f170eb282010"} Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.317965 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbl8v" Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.317987 4759 scope.go:117] "RemoveContainer" containerID="6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb" Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.350094 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbl8v"] Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.361656 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbl8v"] Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.364601 4759 scope.go:117] "RemoveContainer" containerID="d78de95a3ed6e35278c6168f9a71c80ad33f2f00145a889c65073f536f39c41b" Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.388436 4759 scope.go:117] "RemoveContainer" containerID="190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348" Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.452465 4759 scope.go:117] "RemoveContainer" containerID="6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb" Dec 05 01:10:59 crc kubenswrapper[4759]: E1205 01:10:59.452974 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb\": container with ID starting with 6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb not found: ID does not exist" containerID="6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb" Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.453025 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb"} err="failed to get container status \"6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb\": rpc error: code = NotFound desc = could not find container \"6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb\": container with ID starting with 6915dbc4f61b7858e22be1c2b57249694e7364440db8b32975397ad9f1ad02eb not found: ID does not exist" Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.453057 4759 scope.go:117] "RemoveContainer" containerID="d78de95a3ed6e35278c6168f9a71c80ad33f2f00145a889c65073f536f39c41b" Dec 05 01:10:59 crc kubenswrapper[4759]: E1205 01:10:59.453467 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d78de95a3ed6e35278c6168f9a71c80ad33f2f00145a889c65073f536f39c41b\": container with ID starting with d78de95a3ed6e35278c6168f9a71c80ad33f2f00145a889c65073f536f39c41b not found: ID does not exist" containerID="d78de95a3ed6e35278c6168f9a71c80ad33f2f00145a889c65073f536f39c41b" Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.453501 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78de95a3ed6e35278c6168f9a71c80ad33f2f00145a889c65073f536f39c41b"} err="failed to get container status \"d78de95a3ed6e35278c6168f9a71c80ad33f2f00145a889c65073f536f39c41b\": rpc error: code = NotFound desc = could not find container \"d78de95a3ed6e35278c6168f9a71c80ad33f2f00145a889c65073f536f39c41b\": container with ID starting with d78de95a3ed6e35278c6168f9a71c80ad33f2f00145a889c65073f536f39c41b not found: ID does not exist" Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.453522 4759 scope.go:117] "RemoveContainer" 
containerID="190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348" Dec 05 01:10:59 crc kubenswrapper[4759]: E1205 01:10:59.453802 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348\": container with ID starting with 190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348 not found: ID does not exist" containerID="190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348" Dec 05 01:10:59 crc kubenswrapper[4759]: I1205 01:10:59.453850 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348"} err="failed to get container status \"190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348\": rpc error: code = NotFound desc = could not find container \"190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348\": container with ID starting with 190741f80b197f2320572dbd7c91ad1dfd646e0e2dd42a8d25f1363675857348 not found: ID does not exist" Dec 05 01:11:01 crc kubenswrapper[4759]: I1205 01:11:01.184891 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22be5575-57cd-4595-b3ad-b94ce3a00eef" path="/var/lib/kubelet/pods/22be5575-57cd-4595-b3ad-b94ce3a00eef/volumes" Dec 05 01:11:04 crc kubenswrapper[4759]: I1205 01:11:04.434053 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:11:04 crc kubenswrapper[4759]: I1205 01:11:04.434733 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:11:04 crc kubenswrapper[4759]: I1205 01:11:04.434796 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 01:11:04 crc kubenswrapper[4759]: I1205 01:11:04.435872 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 01:11:04 crc kubenswrapper[4759]: I1205 01:11:04.435967 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" gracePeriod=600 Dec 05 01:11:04 crc kubenswrapper[4759]: E1205 01:11:04.582531 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:11:05 crc kubenswrapper[4759]: I1205 01:11:05.389638 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" exitCode=0 Dec 05 01:11:05 crc kubenswrapper[4759]: I1205 01:11:05.389755 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"} Dec 05 01:11:05 crc kubenswrapper[4759]: I1205 01:11:05.390119 4759 scope.go:117] "RemoveContainer" containerID="006ff0659038a9a56fe5eca39d856933d9115b2d975aabcc06e2ec6eb00ac4e8" Dec 05 01:11:05 crc kubenswrapper[4759]: I1205 01:11:05.391267 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:11:05 crc kubenswrapper[4759]: E1205 01:11:05.392077 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:11:17 crc kubenswrapper[4759]: I1205 01:11:17.156677 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:11:17 crc kubenswrapper[4759]: E1205 01:11:17.157733 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:11:30 crc kubenswrapper[4759]: I1205 01:11:30.156686 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:11:30 crc kubenswrapper[4759]: E1205 01:11:30.157503 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:11:45 crc kubenswrapper[4759]: I1205 01:11:45.160216 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:11:45 crc kubenswrapper[4759]: E1205 01:11:45.161084 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" 
podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:11:59 crc kubenswrapper[4759]: I1205 01:11:59.156628 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:11:59 crc kubenswrapper[4759]: E1205 01:11:59.157377 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:12:11 crc kubenswrapper[4759]: I1205 01:12:11.164732 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:12:11 crc kubenswrapper[4759]: E1205 01:12:11.166067 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:12:26 crc kubenswrapper[4759]: I1205 01:12:26.156912 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:12:26 crc kubenswrapper[4759]: E1205 01:12:26.158729 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:12:39 crc kubenswrapper[4759]: I1205 01:12:39.159812 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:12:39 crc kubenswrapper[4759]: E1205 01:12:39.161153 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:12:52 crc kubenswrapper[4759]: I1205 01:12:52.155468 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:12:52 crc kubenswrapper[4759]: E1205 01:12:52.156602 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:13:03 crc kubenswrapper[4759]: I1205 01:13:03.156857 4759 scope.go:117] "RemoveContainer" 
containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:13:03 crc kubenswrapper[4759]: E1205 01:13:03.157701 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:13:18 crc kubenswrapper[4759]: I1205 01:13:18.156453 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:13:18 crc kubenswrapper[4759]: E1205 01:13:18.157481 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:13:33 crc kubenswrapper[4759]: I1205 01:13:33.157121 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:13:33 crc kubenswrapper[4759]: E1205 01:13:33.158091 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:13:44 crc kubenswrapper[4759]: I1205 01:13:44.156559 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:13:44 crc kubenswrapper[4759]: E1205 01:13:44.157628 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:13:57 crc kubenswrapper[4759]: I1205 01:13:57.155975 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:13:57 crc kubenswrapper[4759]: E1205 01:13:57.156888 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.557877 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hhf8j"] Dec 05 01:14:00 crc kubenswrapper[4759]: E1205 01:14:00.558979 4759 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="22be5575-57cd-4595-b3ad-b94ce3a00eef" containerName="registry-server" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.558995 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="22be5575-57cd-4595-b3ad-b94ce3a00eef" containerName="registry-server" Dec 05 01:14:00 crc kubenswrapper[4759]: E1205 01:14:00.559004 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22be5575-57cd-4595-b3ad-b94ce3a00eef" containerName="extract-utilities" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.559010 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="22be5575-57cd-4595-b3ad-b94ce3a00eef" containerName="extract-utilities" Dec 05 01:14:00 crc kubenswrapper[4759]: E1205 01:14:00.559026 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9ccd57-fa3c-429c-a23c-306bf24a4515" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.559033 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9ccd57-fa3c-429c-a23c-306bf24a4515" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 05 01:14:00 crc kubenswrapper[4759]: E1205 01:14:00.559060 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22be5575-57cd-4595-b3ad-b94ce3a00eef" containerName="extract-content" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.559066 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="22be5575-57cd-4595-b3ad-b94ce3a00eef" containerName="extract-content" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.559273 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9ccd57-fa3c-429c-a23c-306bf24a4515" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.559327 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="22be5575-57cd-4595-b3ad-b94ce3a00eef" containerName="registry-server" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.561078 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.574712 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hhf8j"] Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.612663 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-utilities\") pod \"redhat-operators-hhf8j\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") " pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.612726 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-catalog-content\") pod \"redhat-operators-hhf8j\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") " pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.612776 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26dt7\" (UniqueName: \"kubernetes.io/projected/8d73a5f8-fe27-46c1-a732-10871c8d9d15-kube-api-access-26dt7\") pod \"redhat-operators-hhf8j\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") " pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.715514 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-utilities\") pod \"redhat-operators-hhf8j\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") " pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.715583 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-catalog-content\") pod \"redhat-operators-hhf8j\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") " pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.715634 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26dt7\" (UniqueName: \"kubernetes.io/projected/8d73a5f8-fe27-46c1-a732-10871c8d9d15-kube-api-access-26dt7\") pod \"redhat-operators-hhf8j\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") " pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.716259 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-catalog-content\") pod \"redhat-operators-hhf8j\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") " pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.716371 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-utilities\") pod \"redhat-operators-hhf8j\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") " pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.739240 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-26dt7\" (UniqueName: \"kubernetes.io/projected/8d73a5f8-fe27-46c1-a732-10871c8d9d15-kube-api-access-26dt7\") pod \"redhat-operators-hhf8j\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") " pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:00 crc kubenswrapper[4759]: I1205 01:14:00.919128 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:01 crc kubenswrapper[4759]: I1205 01:14:01.432545 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hhf8j"] Dec 05 01:14:01 crc kubenswrapper[4759]: I1205 01:14:01.546918 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhf8j" event={"ID":"8d73a5f8-fe27-46c1-a732-10871c8d9d15","Type":"ContainerStarted","Data":"eba2ec02fdc87af2b8a5cef520108340a46f0a47f703e203c287f945981b7565"} Dec 05 01:14:02 crc kubenswrapper[4759]: I1205 01:14:02.560745 4759 generic.go:334] "Generic (PLEG): container finished" podID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerID="cac0dcdad0567f1a4cb30dadbd447baed60ab86d7139ba7c387d783f22a70408" exitCode=0 Dec 05 01:14:02 crc kubenswrapper[4759]: I1205 01:14:02.560849 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhf8j" event={"ID":"8d73a5f8-fe27-46c1-a732-10871c8d9d15","Type":"ContainerDied","Data":"cac0dcdad0567f1a4cb30dadbd447baed60ab86d7139ba7c387d783f22a70408"} Dec 05 01:14:02 crc kubenswrapper[4759]: I1205 01:14:02.567901 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 01:14:04 crc kubenswrapper[4759]: I1205 01:14:04.577577 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhf8j" event={"ID":"8d73a5f8-fe27-46c1-a732-10871c8d9d15","Type":"ContainerStarted","Data":"4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e"} Dec 05 01:14:05 crc kubenswrapper[4759]: I1205 01:14:05.590302 4759 generic.go:334] "Generic (PLEG): container finished" podID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerID="4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e" exitCode=0 Dec 05 01:14:05 crc kubenswrapper[4759]: I1205 01:14:05.590369 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhf8j" event={"ID":"8d73a5f8-fe27-46c1-a732-10871c8d9d15","Type":"ContainerDied","Data":"4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e"} Dec 05 01:14:05 crc kubenswrapper[4759]: E1205 01:14:05.712879 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d73a5f8_fe27_46c1_a732_10871c8d9d15.slice/crio-conmon-4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e.scope\": RecentStats: unable to find data in memory cache]" Dec 05 01:14:07 crc kubenswrapper[4759]: I1205 01:14:07.622230 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhf8j" event={"ID":"8d73a5f8-fe27-46c1-a732-10871c8d9d15","Type":"ContainerStarted","Data":"2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc"} Dec 05 01:14:07 crc kubenswrapper[4759]: I1205 01:14:07.658097 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hhf8j" podStartSLOduration=3.639119677 
podStartE2EDuration="7.658068465s" podCreationTimestamp="2025-12-05 01:14:00 +0000 UTC" firstStartedPulling="2025-12-05 01:14:02.567581367 +0000 UTC m=+3061.783242327" lastFinishedPulling="2025-12-05 01:14:06.586530165 +0000 UTC m=+3065.802191115" observedRunningTime="2025-12-05 01:14:07.647557749 +0000 UTC m=+3066.863218719" watchObservedRunningTime="2025-12-05 01:14:07.658068465 +0000 UTC m=+3066.873729435" Dec 05 01:14:10 crc kubenswrapper[4759]: I1205 01:14:10.920093 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:10 crc kubenswrapper[4759]: I1205 01:14:10.920583 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:11 crc kubenswrapper[4759]: I1205 01:14:11.988898 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hhf8j" podUID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerName="registry-server" probeResult="failure" output=< Dec 05 01:14:11 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 01:14:11 crc kubenswrapper[4759]: > Dec 05 01:14:12 crc kubenswrapper[4759]: I1205 01:14:12.155867 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:14:12 crc kubenswrapper[4759]: E1205 01:14:12.156494 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:14:21 crc kubenswrapper[4759]: I1205 01:14:21.001778 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:21 crc kubenswrapper[4759]: I1205 01:14:21.066940 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hhf8j" Dec 05 01:14:21 crc kubenswrapper[4759]: I1205 01:14:21.254835 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hhf8j"] Dec 05 01:14:22 crc kubenswrapper[4759]: I1205 01:14:22.810028 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hhf8j" podUID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerName="registry-server" containerID="cri-o://2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc" gracePeriod=2 Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.305992 4759 util.go:48] "No ready sandbox for pod can be found. 
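
The Startup probe failure at 01:14:11 ("timeout: failed to connect service \":50051\" within 1s") is the gRPC health check against the registry-server port; it keeps failing until the catalog is loaded and the port accepts connections, at which point both the startup and readiness probes flip at 01:14:21. A rough Python stand-in for the connect-with-deadline part (host, port default, and the plain-TCP simplification are assumptions; the real probe also speaks the gRPC health protocol):

    import socket

    # Approximates the probe's connect-with-deadline against the registry
    # port from the log; a real check would also issue a gRPC health RPC.
    def can_connect(port=50051, host="127.0.0.1", timeout=1.0):
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False
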
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.408256 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-utilities\") pod \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") "
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.408362 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-catalog-content\") pod \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") "
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.408404 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26dt7\" (UniqueName: \"kubernetes.io/projected/8d73a5f8-fe27-46c1-a732-10871c8d9d15-kube-api-access-26dt7\") pod \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\" (UID: \"8d73a5f8-fe27-46c1-a732-10871c8d9d15\") "
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.409696 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-utilities" (OuterVolumeSpecName: "utilities") pod "8d73a5f8-fe27-46c1-a732-10871c8d9d15" (UID: "8d73a5f8-fe27-46c1-a732-10871c8d9d15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.418277 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d73a5f8-fe27-46c1-a732-10871c8d9d15-kube-api-access-26dt7" (OuterVolumeSpecName: "kube-api-access-26dt7") pod "8d73a5f8-fe27-46c1-a732-10871c8d9d15" (UID: "8d73a5f8-fe27-46c1-a732-10871c8d9d15"). InnerVolumeSpecName "kube-api-access-26dt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.510862 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26dt7\" (UniqueName: \"kubernetes.io/projected/8d73a5f8-fe27-46c1-a732-10871c8d9d15-kube-api-access-26dt7\") on node \"crc\" DevicePath \"\""
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.510907 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.527887 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d73a5f8-fe27-46c1-a732-10871c8d9d15" (UID: "8d73a5f8-fe27-46c1-a732-10871c8d9d15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.612355 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d73a5f8-fe27-46c1-a732-10871c8d9d15-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.824789 4759 generic.go:334] "Generic (PLEG): container finished" podID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerID="2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc" exitCode=0
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.824855 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhf8j" event={"ID":"8d73a5f8-fe27-46c1-a732-10871c8d9d15","Type":"ContainerDied","Data":"2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc"}
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.824910 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhf8j" event={"ID":"8d73a5f8-fe27-46c1-a732-10871c8d9d15","Type":"ContainerDied","Data":"eba2ec02fdc87af2b8a5cef520108340a46f0a47f703e203c287f945981b7565"}
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.824942 4759 scope.go:117] "RemoveContainer" containerID="2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc"
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.826742 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hhf8j"
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.878604 4759 scope.go:117] "RemoveContainer" containerID="4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e"
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.886819 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hhf8j"]
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.895111 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hhf8j"]
Dec 05 01:14:23 crc kubenswrapper[4759]: I1205 01:14:23.912006 4759 scope.go:117] "RemoveContainer" containerID="cac0dcdad0567f1a4cb30dadbd447baed60ab86d7139ba7c387d783f22a70408"
Dec 05 01:14:24 crc kubenswrapper[4759]: I1205 01:14:24.054166 4759 scope.go:117] "RemoveContainer" containerID="2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc"
Dec 05 01:14:24 crc kubenswrapper[4759]: E1205 01:14:24.055075 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc\": container with ID starting with 2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc not found: ID does not exist" containerID="2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc"
Dec 05 01:14:24 crc kubenswrapper[4759]: I1205 01:14:24.055109 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc"} err="failed to get container status \"2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc\": rpc error: code = NotFound desc = could not find container \"2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc\": container with ID starting with 2e8eb30a94a316da96258736fbec3a2c38168c596fc7a144a14f4895cffab6fc not found: ID does not exist"
Dec 05 01:14:24 crc kubenswrapper[4759]: I1205 01:14:24.055133 4759 scope.go:117] "RemoveContainer" containerID="4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e"
Dec 05 01:14:24 crc kubenswrapper[4759]: E1205 01:14:24.057538 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e\": container with ID starting with 4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e not found: ID does not exist" containerID="4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e"
Dec 05 01:14:24 crc kubenswrapper[4759]: I1205 01:14:24.057586 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e"} err="failed to get container status \"4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e\": rpc error: code = NotFound desc = could not find container \"4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e\": container with ID starting with 4b7ff6f3be2806d42a2e66bfa36eff4a0ae3752c80d2bee1ab5ae9ddc5f2b68e not found: ID does not exist"
Dec 05 01:14:24 crc kubenswrapper[4759]: I1205 01:14:24.057618 4759 scope.go:117] "RemoveContainer" containerID="cac0dcdad0567f1a4cb30dadbd447baed60ab86d7139ba7c387d783f22a70408"
Dec 05 01:14:24 crc kubenswrapper[4759]: E1205 01:14:24.058129 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cac0dcdad0567f1a4cb30dadbd447baed60ab86d7139ba7c387d783f22a70408\": container with ID starting with cac0dcdad0567f1a4cb30dadbd447baed60ab86d7139ba7c387d783f22a70408 not found: ID does not exist" containerID="cac0dcdad0567f1a4cb30dadbd447baed60ab86d7139ba7c387d783f22a70408"
Dec 05 01:14:24 crc kubenswrapper[4759]: I1205 01:14:24.058177 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac0dcdad0567f1a4cb30dadbd447baed60ab86d7139ba7c387d783f22a70408"} err="failed to get container status \"cac0dcdad0567f1a4cb30dadbd447baed60ab86d7139ba7c387d783f22a70408\": rpc error: code = NotFound desc = could not find container \"cac0dcdad0567f1a4cb30dadbd447baed60ab86d7139ba7c387d783f22a70408\": container with ID starting with cac0dcdad0567f1a4cb30dadbd447baed60ab86d7139ba7c387d783f22a70408 not found: ID does not exist"
Dec 05 01:14:25 crc kubenswrapper[4759]: I1205 01:14:25.185896 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" path="/var/lib/kubelet/pods/8d73a5f8-fe27-46c1-a732-10871c8d9d15/volumes"
Dec 05 01:14:27 crc kubenswrapper[4759]: I1205 01:14:27.157212 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"
Dec 05 01:14:27 crc kubenswrapper[4759]: E1205 01:14:27.157672 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:14:41 crc kubenswrapper[4759]: I1205 01:14:41.170002 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"
Dec 05 01:14:41 crc kubenswrapper[4759]: E1205 01:14:41.171112 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:14:56 crc kubenswrapper[4759]: I1205 01:14:56.157007 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"
Dec 05 01:14:56 crc kubenswrapper[4759]: E1205 01:14:56.157842 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.156271 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"]
Dec 05 01:15:00 crc kubenswrapper[4759]: E1205 01:15:00.157256 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerName="registry-server"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.157270 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerName="registry-server"
Dec 05 01:15:00 crc kubenswrapper[4759]: E1205 01:15:00.157281 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerName="extract-content"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.157289 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerName="extract-content"
Dec 05 01:15:00 crc kubenswrapper[4759]: E1205 01:15:00.157321 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerName="extract-utilities"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.157331 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerName="extract-utilities"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.157561 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d73a5f8-fe27-46c1-a732-10871c8d9d15" containerName="registry-server"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.158291 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
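
The 29414955 in collect-profiles-29414955-mcnw6 is the CronJob controller's naming scheme: each Job is suffixed with its scheduled time expressed in minutes since the Unix epoch. Decoding it lands exactly on the 01:15:00 schedule seen in these entries:

    from datetime import datetime, timezone

    # CronJob-created Jobs carry their scheduled time, in minutes since
    # the Unix epoch, as a name suffix; 29414955 is from the pod name above.
    print(datetime.fromtimestamp(29414955 * 60, tz=timezone.utc))
    # 2025-12-05 01:15:00+00:00 -- matching the SyncLoop ADD timestamp

(The older collect-profiles-29414910-gbhcr deleted at 01:15:04 below decodes the same way, to the 00:30:00 run.)
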
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.163958 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.164140 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.170281 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"]
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.218099 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a6b9c8c-896e-421a-9704-5e96eece6da1-secret-volume\") pod \"collect-profiles-29414955-mcnw6\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.218353 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcjfs\" (UniqueName: \"kubernetes.io/projected/8a6b9c8c-896e-421a-9704-5e96eece6da1-kube-api-access-jcjfs\") pod \"collect-profiles-29414955-mcnw6\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.218490 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a6b9c8c-896e-421a-9704-5e96eece6da1-config-volume\") pod \"collect-profiles-29414955-mcnw6\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.325281 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcjfs\" (UniqueName: \"kubernetes.io/projected/8a6b9c8c-896e-421a-9704-5e96eece6da1-kube-api-access-jcjfs\") pod \"collect-profiles-29414955-mcnw6\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.325554 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a6b9c8c-896e-421a-9704-5e96eece6da1-config-volume\") pod \"collect-profiles-29414955-mcnw6\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.325704 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a6b9c8c-896e-421a-9704-5e96eece6da1-secret-volume\") pod \"collect-profiles-29414955-mcnw6\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.330467 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a6b9c8c-896e-421a-9704-5e96eece6da1-config-volume\") pod \"collect-profiles-29414955-mcnw6\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.333279 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a6b9c8c-896e-421a-9704-5e96eece6da1-secret-volume\") pod \"collect-profiles-29414955-mcnw6\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.353073 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcjfs\" (UniqueName: \"kubernetes.io/projected/8a6b9c8c-896e-421a-9704-5e96eece6da1-kube-api-access-jcjfs\") pod \"collect-profiles-29414955-mcnw6\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:00 crc kubenswrapper[4759]: I1205 01:15:00.487510 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:01 crc kubenswrapper[4759]: I1205 01:15:01.009691 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"]
Dec 05 01:15:01 crc kubenswrapper[4759]: I1205 01:15:01.329610 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6" event={"ID":"8a6b9c8c-896e-421a-9704-5e96eece6da1","Type":"ContainerStarted","Data":"9a9a63dddaf02b72e93eaa2a6a61152734c1b3b62db1b21469670b56fcc09018"}
Dec 05 01:15:01 crc kubenswrapper[4759]: I1205 01:15:01.329660 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6" event={"ID":"8a6b9c8c-896e-421a-9704-5e96eece6da1","Type":"ContainerStarted","Data":"4aac0908187635f611d7210f1093ecb6536621030ccc0ece0f72b59c3579e2f3"}
Dec 05 01:15:01 crc kubenswrapper[4759]: I1205 01:15:01.347749 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6" podStartSLOduration=1.347732357 podStartE2EDuration="1.347732357s" podCreationTimestamp="2025-12-05 01:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:15:01.343597666 +0000 UTC m=+3120.559258616" watchObservedRunningTime="2025-12-05 01:15:01.347732357 +0000 UTC m=+3120.563393307"
Dec 05 01:15:02 crc kubenswrapper[4759]: I1205 01:15:02.342699 4759 generic.go:334] "Generic (PLEG): container finished" podID="8a6b9c8c-896e-421a-9704-5e96eece6da1" containerID="9a9a63dddaf02b72e93eaa2a6a61152734c1b3b62db1b21469670b56fcc09018" exitCode=0
Dec 05 01:15:02 crc kubenswrapper[4759]: I1205 01:15:02.342811 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6" event={"ID":"8a6b9c8c-896e-421a-9704-5e96eece6da1","Type":"ContainerDied","Data":"9a9a63dddaf02b72e93eaa2a6a61152734c1b3b62db1b21469670b56fcc09018"}
Dec 05 01:15:03 crc kubenswrapper[4759]: I1205 01:15:03.754082 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:03 crc kubenswrapper[4759]: I1205 01:15:03.823913 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcjfs\" (UniqueName: \"kubernetes.io/projected/8a6b9c8c-896e-421a-9704-5e96eece6da1-kube-api-access-jcjfs\") pod \"8a6b9c8c-896e-421a-9704-5e96eece6da1\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") "
Dec 05 01:15:03 crc kubenswrapper[4759]: I1205 01:15:03.823973 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a6b9c8c-896e-421a-9704-5e96eece6da1-config-volume\") pod \"8a6b9c8c-896e-421a-9704-5e96eece6da1\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") "
Dec 05 01:15:03 crc kubenswrapper[4759]: I1205 01:15:03.824107 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a6b9c8c-896e-421a-9704-5e96eece6da1-secret-volume\") pod \"8a6b9c8c-896e-421a-9704-5e96eece6da1\" (UID: \"8a6b9c8c-896e-421a-9704-5e96eece6da1\") "
Dec 05 01:15:03 crc kubenswrapper[4759]: I1205 01:15:03.825357 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a6b9c8c-896e-421a-9704-5e96eece6da1-config-volume" (OuterVolumeSpecName: "config-volume") pod "8a6b9c8c-896e-421a-9704-5e96eece6da1" (UID: "8a6b9c8c-896e-421a-9704-5e96eece6da1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 01:15:03 crc kubenswrapper[4759]: I1205 01:15:03.829973 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6b9c8c-896e-421a-9704-5e96eece6da1-kube-api-access-jcjfs" (OuterVolumeSpecName: "kube-api-access-jcjfs") pod "8a6b9c8c-896e-421a-9704-5e96eece6da1" (UID: "8a6b9c8c-896e-421a-9704-5e96eece6da1"). InnerVolumeSpecName "kube-api-access-jcjfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:15:03 crc kubenswrapper[4759]: I1205 01:15:03.831734 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6b9c8c-896e-421a-9704-5e96eece6da1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8a6b9c8c-896e-421a-9704-5e96eece6da1" (UID: "8a6b9c8c-896e-421a-9704-5e96eece6da1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 01:15:03 crc kubenswrapper[4759]: I1205 01:15:03.927358 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcjfs\" (UniqueName: \"kubernetes.io/projected/8a6b9c8c-896e-421a-9704-5e96eece6da1-kube-api-access-jcjfs\") on node \"crc\" DevicePath \"\""
Dec 05 01:15:03 crc kubenswrapper[4759]: I1205 01:15:03.927404 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a6b9c8c-896e-421a-9704-5e96eece6da1-config-volume\") on node \"crc\" DevicePath \"\""
Dec 05 01:15:03 crc kubenswrapper[4759]: I1205 01:15:03.927419 4759 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a6b9c8c-896e-421a-9704-5e96eece6da1-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 05 01:15:04 crc kubenswrapper[4759]: I1205 01:15:04.265965 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr"]
Dec 05 01:15:04 crc kubenswrapper[4759]: I1205 01:15:04.284839 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414910-gbhcr"]
Dec 05 01:15:04 crc kubenswrapper[4759]: I1205 01:15:04.364111 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6" event={"ID":"8a6b9c8c-896e-421a-9704-5e96eece6da1","Type":"ContainerDied","Data":"4aac0908187635f611d7210f1093ecb6536621030ccc0ece0f72b59c3579e2f3"}
Dec 05 01:15:04 crc kubenswrapper[4759]: I1205 01:15:04.364158 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aac0908187635f611d7210f1093ecb6536621030ccc0ece0f72b59c3579e2f3"
Dec 05 01:15:04 crc kubenswrapper[4759]: I1205 01:15:04.364175 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"
Dec 05 01:15:05 crc kubenswrapper[4759]: I1205 01:15:05.179160 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d9382a4-5036-4fb4-850c-e5a26d299f02" path="/var/lib/kubelet/pods/4d9382a4-5036-4fb4-850c-e5a26d299f02/volumes"
Dec 05 01:15:10 crc kubenswrapper[4759]: I1205 01:15:10.155921 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"
Dec 05 01:15:10 crc kubenswrapper[4759]: E1205 01:15:10.157532 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:15:22 crc kubenswrapper[4759]: I1205 01:15:22.156181 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"
Dec 05 01:15:22 crc kubenswrapper[4759]: E1205 01:15:22.157515 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:15:29 crc kubenswrapper[4759]: I1205 01:15:29.752063 4759 scope.go:117] "RemoveContainer" containerID="c95db5087895e2b94b651c69c0250b8e9fae547296d801bb002c598b5146ff73"
Dec 05 01:15:37 crc kubenswrapper[4759]: I1205 01:15:37.156437 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"
Dec 05 01:15:37 crc kubenswrapper[4759]: E1205 01:15:37.158421 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:15:50 crc kubenswrapper[4759]: I1205 01:15:50.156677 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"
Dec 05 01:15:50 crc kubenswrapper[4759]: E1205 01:15:50.158075 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:16:03 crc kubenswrapper[4759]: I1205 01:16:03.155888 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"
Dec 05 01:16:03 crc kubenswrapper[4759]: E1205 01:16:03.157276 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:16:14 crc kubenswrapper[4759]: I1205 01:16:14.156209 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade"
Dec 05 01:16:15 crc kubenswrapper[4759]: I1205 01:16:15.324916 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"666b9dc92fa2ce40e04f9e0757fc98a1d90be948a6233de9b66c266d66b60f32"}
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.190350 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.190643 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.208158 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.216929 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.225046 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.233124 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.242674 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.251034 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6wn2z"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.261611 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.271149 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.284742 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-92m6k"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.296431 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wbvls"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.305262 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.315850 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ktndb"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.325348 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xdwfq"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.332955 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-frmrf"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.340458 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2wjzb"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.349778 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.360296 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-db89f"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.383459 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.394113 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.404946 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4rh5"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.413817 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrzfq"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.421570 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-zzwzh"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.428512 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-68b7l"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.437572 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.451047 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-92m6k"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.466944 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2thks"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.479622 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9tbds"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.495576 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-xx2hm"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.506588 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh"]
Dec 05 01:16:17 crc kubenswrapper[4759]: I1205 01:16:17.517724 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hbfbh"]
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.177884 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d4c75d8-1c1c-43d5-b534-34d0b44decf9" path="/var/lib/kubelet/pods/2d4c75d8-1c1c-43d5-b534-34d0b44decf9/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.179682 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b" path="/var/lib/kubelet/pods/35b4fcd7-db7b-42c8-a3bf-46b2de8c7e2b/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.180861 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ec53225-5ccb-4be7-af07-c86a1931fea9" path="/var/lib/kubelet/pods/4ec53225-5ccb-4be7-af07-c86a1931fea9/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.183193 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fdd474f-9093-4247-87af-9731a451fc7f" path="/var/lib/kubelet/pods/5fdd474f-9093-4247-87af-9731a451fc7f/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.184432 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80a82f73-12c2-4a77-9bc4-500b26cacfa5" path="/var/lib/kubelet/pods/80a82f73-12c2-4a77-9bc4-500b26cacfa5/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.185619 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92c0abf7-a6cc-411f-bccc-778819b2370d" path="/var/lib/kubelet/pods/92c0abf7-a6cc-411f-bccc-778819b2370d/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.186985 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f4137c-2b87-401f-ba79-56befbbd9757" path="/var/lib/kubelet/pods/95f4137c-2b87-401f-ba79-56befbbd9757/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.189168 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a691f6a0-4a00-491e-ad04-32f31f8dc175" path="/var/lib/kubelet/pods/a691f6a0-4a00-491e-ad04-32f31f8dc175/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.190432 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a99c592a-6fa3-42dd-af6c-cfb8dc151bff" path="/var/lib/kubelet/pods/a99c592a-6fa3-42dd-af6c-cfb8dc151bff/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.191652 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb51317e-dc76-4d08-af63-d4719ab711d9" path="/var/lib/kubelet/pods/bb51317e-dc76-4d08-af63-d4719ab711d9/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.193816 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0c1f555-7a29-4b6c-8047-46941df58dca" path="/var/lib/kubelet/pods/c0c1f555-7a29-4b6c-8047-46941df58dca/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.195420 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b3d06e-6304-425d-b688-524cfbf7ea5a" path="/var/lib/kubelet/pods/c3b3d06e-6304-425d-b688-524cfbf7ea5a/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.196740 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d871c1-ff7b-408a-85a2-3fe04612bdd4" path="/var/lib/kubelet/pods/d2d871c1-ff7b-408a-85a2-3fe04612bdd4/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.197944 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9ccd57-fa3c-429c-a23c-306bf24a4515" path="/var/lib/kubelet/pods/df9ccd57-fa3c-429c-a23c-306bf24a4515/volumes"
Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.200004 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfd566ee-35b4-4a14-9683-3b93f9fb272e" path="/var/lib/kubelet/pods/dfd566ee-35b4-4a14-9683-3b93f9fb272e/volumes"
path="/var/lib/kubelet/pods/dfd566ee-35b4-4a14-9683-3b93f9fb272e/volumes" Dec 05 01:16:19 crc kubenswrapper[4759]: I1205 01:16:19.201125 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0e55b2-fb6e-4259-a701-84c4599770c7" path="/var/lib/kubelet/pods/fe0e55b2-fb6e-4259-a701-84c4599770c7/volumes" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.000371 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x"] Dec 05 01:16:22 crc kubenswrapper[4759]: E1205 01:16:22.002418 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6b9c8c-896e-421a-9704-5e96eece6da1" containerName="collect-profiles" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.002541 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6b9c8c-896e-421a-9704-5e96eece6da1" containerName="collect-profiles" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.002879 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a6b9c8c-896e-421a-9704-5e96eece6da1" containerName="collect-profiles" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.003922 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.006494 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.006840 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.006897 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.007032 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.008197 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.015052 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x"] Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.177514 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.177652 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftxmt\" (UniqueName: \"kubernetes.io/projected/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-kube-api-access-ftxmt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.177685 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.177832 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.177869 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.279495 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftxmt\" (UniqueName: \"kubernetes.io/projected/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-kube-api-access-ftxmt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.279551 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.279669 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.279703 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.279792 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.286185 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.286712 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.291790 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.295899 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.300259 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftxmt\" (UniqueName: \"kubernetes.io/projected/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-kube-api-access-ftxmt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:22 crc kubenswrapper[4759]: I1205 01:16:22.334939 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:23 crc kubenswrapper[4759]: I1205 01:16:23.012124 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x"] Dec 05 01:16:23 crc kubenswrapper[4759]: I1205 01:16:23.458028 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" event={"ID":"77d4cfb2-ced1-4306-a020-5ea1a3ed597c","Type":"ContainerStarted","Data":"35c715111deb1553c5eaf0d67d7a734a32e9c440a0c1148d96599594be37aa43"} Dec 05 01:16:24 crc kubenswrapper[4759]: I1205 01:16:24.475547 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" event={"ID":"77d4cfb2-ced1-4306-a020-5ea1a3ed597c","Type":"ContainerStarted","Data":"8d1093db633e0af7bac577740d0eeff01fd925c1a18a986b290e77280b72f915"} Dec 05 01:16:24 crc kubenswrapper[4759]: I1205 01:16:24.556523 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" podStartSLOduration=3.074280345 podStartE2EDuration="3.556501474s" podCreationTimestamp="2025-12-05 01:16:21 +0000 UTC" firstStartedPulling="2025-12-05 01:16:23.011662442 +0000 UTC m=+3202.227323392" lastFinishedPulling="2025-12-05 01:16:23.493883551 +0000 UTC m=+3202.709544521" observedRunningTime="2025-12-05 01:16:24.506496359 +0000 UTC m=+3203.722157309" watchObservedRunningTime="2025-12-05 01:16:24.556501474 +0000 UTC m=+3203.772162424" Dec 05 01:16:29 crc kubenswrapper[4759]: I1205 01:16:29.867259 4759 scope.go:117] "RemoveContainer" containerID="66956da93a9224be4a4ad241c32187e0ee258e662f8ccbf2d6d40fb5e91adcce" Dec 05 01:16:29 crc kubenswrapper[4759]: I1205 01:16:29.957641 4759 scope.go:117] "RemoveContainer" containerID="f395de7e32c6dbdc6f6fff9eb361f3cd29adf023fc795ef7718d925ee23e1746" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:29.999617 4759 scope.go:117] "RemoveContainer" containerID="f71bffd0bd259e97d4cb2adb2f2b8c5c7f20d86681f72ccec2d14892d6afb141" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.124780 4759 scope.go:117] "RemoveContainer" containerID="f8c4ef36814f43e4d7667fadf427f44517277f93c8611dc722c7e7b298f6ad7f" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.192201 4759 scope.go:117] "RemoveContainer" containerID="d0b1a1d0291c0cfff6e45d51a776ac2278f659e2aa0bc01a05a5ca5a762850c4" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.222258 4759 scope.go:117] "RemoveContainer" containerID="253f8a43f31a7ddb7a0ef92eb148f1bbbd98514c5fe45af5e92b18038e4d8157" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.292281 4759 scope.go:117] "RemoveContainer" containerID="7752af2f399c656c6fece04afcefc5133f386037a09abd5c35d132bbd6dd43d3" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.322836 4759 scope.go:117] "RemoveContainer" containerID="9203bed4864fb03e0cb1be01cc327b1b9e6cf1e55c83a28b6aa9d283ea03596c" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.405388 4759 scope.go:117] "RemoveContainer" containerID="c45d18f5975f8e50a07bbeed5c0c4b8d9ecabe797e96a53472f8efc06eefec3a" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.485406 4759 scope.go:117] "RemoveContainer" containerID="ba38081fb843b12b5683fc7f75b0990eb2badf6a7e9f6e7cb9056e4b3a53f741" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.570760 4759 scope.go:117] "RemoveContainer" 
containerID="c801697c7f41025dc5e6dd5f0c573bfc35f754dce7df3f2a04fffd73b4580b53" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.654646 4759 scope.go:117] "RemoveContainer" containerID="6574019cbcabd3499e8d1afaba101b64b04a81a6f8bd4ef12447cd3f528eba82" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.699385 4759 scope.go:117] "RemoveContainer" containerID="027a50605e6c7248a4b5999a339ea6c3e6c34d59c18643c8d709e752ba0cddb4" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.728892 4759 scope.go:117] "RemoveContainer" containerID="97c717893f976cc3ff54ad450decc2c75796424effb83e3348c113847f657353" Dec 05 01:16:30 crc kubenswrapper[4759]: I1205 01:16:30.783154 4759 scope.go:117] "RemoveContainer" containerID="b58828a728d123d6832ce7f98ab3312299d3d9ff44fc5d3e05f86eb3909ecd69" Dec 05 01:16:37 crc kubenswrapper[4759]: I1205 01:16:37.663357 4759 generic.go:334] "Generic (PLEG): container finished" podID="77d4cfb2-ced1-4306-a020-5ea1a3ed597c" containerID="8d1093db633e0af7bac577740d0eeff01fd925c1a18a986b290e77280b72f915" exitCode=0 Dec 05 01:16:37 crc kubenswrapper[4759]: I1205 01:16:37.663425 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" event={"ID":"77d4cfb2-ced1-4306-a020-5ea1a3ed597c","Type":"ContainerDied","Data":"8d1093db633e0af7bac577740d0eeff01fd925c1a18a986b290e77280b72f915"} Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.289659 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.445598 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftxmt\" (UniqueName: \"kubernetes.io/projected/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-kube-api-access-ftxmt\") pod \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.445673 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-repo-setup-combined-ca-bundle\") pod \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.445741 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ssh-key\") pod \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.445859 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-inventory\") pod \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.445896 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ceph\") pod \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\" (UID: \"77d4cfb2-ced1-4306-a020-5ea1a3ed597c\") " Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.452337 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-kube-api-access-ftxmt" (OuterVolumeSpecName: "kube-api-access-ftxmt") pod "77d4cfb2-ced1-4306-a020-5ea1a3ed597c" (UID: "77d4cfb2-ced1-4306-a020-5ea1a3ed597c"). InnerVolumeSpecName "kube-api-access-ftxmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.453355 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "77d4cfb2-ced1-4306-a020-5ea1a3ed597c" (UID: "77d4cfb2-ced1-4306-a020-5ea1a3ed597c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.455449 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ceph" (OuterVolumeSpecName: "ceph") pod "77d4cfb2-ced1-4306-a020-5ea1a3ed597c" (UID: "77d4cfb2-ced1-4306-a020-5ea1a3ed597c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.487594 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-inventory" (OuterVolumeSpecName: "inventory") pod "77d4cfb2-ced1-4306-a020-5ea1a3ed597c" (UID: "77d4cfb2-ced1-4306-a020-5ea1a3ed597c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.491606 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "77d4cfb2-ced1-4306-a020-5ea1a3ed597c" (UID: "77d4cfb2-ced1-4306-a020-5ea1a3ed597c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.548278 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftxmt\" (UniqueName: \"kubernetes.io/projected/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-kube-api-access-ftxmt\") on node \"crc\" DevicePath \"\"" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.548333 4759 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.548349 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.548362 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.548372 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/77d4cfb2-ced1-4306-a020-5ea1a3ed597c-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.685884 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" event={"ID":"77d4cfb2-ced1-4306-a020-5ea1a3ed597c","Type":"ContainerDied","Data":"35c715111deb1553c5eaf0d67d7a734a32e9c440a0c1148d96599594be37aa43"} Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.685934 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35c715111deb1553c5eaf0d67d7a734a32e9c440a0c1148d96599594be37aa43" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.686008 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.799677 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp"] Dec 05 01:16:39 crc kubenswrapper[4759]: E1205 01:16:39.800146 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77d4cfb2-ced1-4306-a020-5ea1a3ed597c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.800173 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="77d4cfb2-ced1-4306-a020-5ea1a3ed597c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.800489 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="77d4cfb2-ced1-4306-a020-5ea1a3ed597c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.801428 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.805536 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.805701 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.805828 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.805861 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.806263 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.814447 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp"] Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.956253 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.956357 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwfk8\" (UniqueName: \"kubernetes.io/projected/9542247a-5527-4bd2-bc5d-8bd30be01c1d-kube-api-access-jwfk8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.956385 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.956452 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:39 crc kubenswrapper[4759]: I1205 01:16:39.956480 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:40 crc kubenswrapper[4759]: I1205 01:16:40.058763 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:40 crc kubenswrapper[4759]: I1205 01:16:40.058848 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:40 crc kubenswrapper[4759]: I1205 01:16:40.058958 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:40 crc kubenswrapper[4759]: I1205 01:16:40.059018 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwfk8\" (UniqueName: \"kubernetes.io/projected/9542247a-5527-4bd2-bc5d-8bd30be01c1d-kube-api-access-jwfk8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:40 crc kubenswrapper[4759]: I1205 01:16:40.059045 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:40 crc kubenswrapper[4759]: I1205 01:16:40.063087 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:40 crc kubenswrapper[4759]: I1205 01:16:40.063113 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:40 crc kubenswrapper[4759]: I1205 01:16:40.063663 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:40 crc kubenswrapper[4759]: I1205 01:16:40.064117 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:40 crc kubenswrapper[4759]: I1205 01:16:40.082270 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwfk8\" (UniqueName: \"kubernetes.io/projected/9542247a-5527-4bd2-bc5d-8bd30be01c1d-kube-api-access-jwfk8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:40 crc kubenswrapper[4759]: I1205 01:16:40.119562 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:16:41 crc kubenswrapper[4759]: I1205 01:16:41.066094 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp"] Dec 05 01:16:41 crc kubenswrapper[4759]: W1205 01:16:41.079206 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9542247a_5527_4bd2_bc5d_8bd30be01c1d.slice/crio-0bc706bf9dd8192f0f4654fabc5172bb158c28f6759906798abff09ade22dffd WatchSource:0}: Error finding container 0bc706bf9dd8192f0f4654fabc5172bb158c28f6759906798abff09ade22dffd: Status 404 returned error can't find the container with id 0bc706bf9dd8192f0f4654fabc5172bb158c28f6759906798abff09ade22dffd Dec 05 01:16:41 crc kubenswrapper[4759]: I1205 01:16:41.709691 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" event={"ID":"9542247a-5527-4bd2-bc5d-8bd30be01c1d","Type":"ContainerStarted","Data":"babf2053be28ab77e1a81ac2e2265b4e3034b9b498e99c74a2e96e658fb7c36b"} Dec 05 01:16:41 crc kubenswrapper[4759]: I1205 01:16:41.710214 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" event={"ID":"9542247a-5527-4bd2-bc5d-8bd30be01c1d","Type":"ContainerStarted","Data":"0bc706bf9dd8192f0f4654fabc5172bb158c28f6759906798abff09ade22dffd"} Dec 05 01:16:41 crc kubenswrapper[4759]: I1205 01:16:41.730299 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" podStartSLOduration=2.326558327 podStartE2EDuration="2.730278627s" podCreationTimestamp="2025-12-05 01:16:39 +0000 UTC" firstStartedPulling="2025-12-05 01:16:41.085239862 +0000 UTC m=+3220.300900802" lastFinishedPulling="2025-12-05 01:16:41.488960152 +0000 UTC m=+3220.704621102" observedRunningTime="2025-12-05 01:16:41.723650056 +0000 UTC m=+3220.939311006" watchObservedRunningTime="2025-12-05 01:16:41.730278627 +0000 UTC m=+3220.945939577" Dec 05 01:17:31 crc kubenswrapper[4759]: I1205 01:17:31.116862 4759 scope.go:117] "RemoveContainer" containerID="e9c34d65eec1b4d7620a0b28265fd37acc6247ba0f45181b3f795e56b8a489be" Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.705683 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-85t87"] Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.709034 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.745659 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-85t87"] Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.752138 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-utilities\") pod \"certified-operators-85t87\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.752354 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-catalog-content\") pod \"certified-operators-85t87\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.752454 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkbgl\" (UniqueName: \"kubernetes.io/projected/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-kube-api-access-xkbgl\") pod \"certified-operators-85t87\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.854790 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-catalog-content\") pod \"certified-operators-85t87\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.855111 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkbgl\" (UniqueName: \"kubernetes.io/projected/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-kube-api-access-xkbgl\") pod \"certified-operators-85t87\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.855208 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-utilities\") pod \"certified-operators-85t87\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.855469 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-catalog-content\") pod \"certified-operators-85t87\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.855769 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-utilities\") pod \"certified-operators-85t87\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:17:57 crc kubenswrapper[4759]: I1205 01:17:57.875670 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xkbgl\" (UniqueName: \"kubernetes.io/projected/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-kube-api-access-xkbgl\") pod \"certified-operators-85t87\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:17:58 crc kubenswrapper[4759]: I1205 01:17:58.045512 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:17:58 crc kubenswrapper[4759]: I1205 01:17:58.529825 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-85t87"] Dec 05 01:17:58 crc kubenswrapper[4759]: I1205 01:17:58.592446 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85t87" event={"ID":"b6dc2362-a615-4969-ba6d-c3ad72b7fd77","Type":"ContainerStarted","Data":"bdbfea4a894942d6b122a8db812b0e60ef1788546da63cf76ae5171887071ccb"} Dec 05 01:17:59 crc kubenswrapper[4759]: I1205 01:17:59.610697 4759 generic.go:334] "Generic (PLEG): container finished" podID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" containerID="6d42c6c760c49686d05d935e6820290114658e155a24fe6839392433d27553ac" exitCode=0 Dec 05 01:17:59 crc kubenswrapper[4759]: I1205 01:17:59.610800 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85t87" event={"ID":"b6dc2362-a615-4969-ba6d-c3ad72b7fd77","Type":"ContainerDied","Data":"6d42c6c760c49686d05d935e6820290114658e155a24fe6839392433d27553ac"} Dec 05 01:18:01 crc kubenswrapper[4759]: I1205 01:18:01.643649 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85t87" event={"ID":"b6dc2362-a615-4969-ba6d-c3ad72b7fd77","Type":"ContainerStarted","Data":"32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad"} Dec 05 01:18:02 crc kubenswrapper[4759]: I1205 01:18:02.654772 4759 generic.go:334] "Generic (PLEG): container finished" podID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" containerID="32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad" exitCode=0 Dec 05 01:18:02 crc kubenswrapper[4759]: I1205 01:18:02.655041 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85t87" event={"ID":"b6dc2362-a615-4969-ba6d-c3ad72b7fd77","Type":"ContainerDied","Data":"32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad"} Dec 05 01:18:03 crc kubenswrapper[4759]: I1205 01:18:03.681777 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85t87" event={"ID":"b6dc2362-a615-4969-ba6d-c3ad72b7fd77","Type":"ContainerStarted","Data":"6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b"} Dec 05 01:18:03 crc kubenswrapper[4759]: I1205 01:18:03.712889 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-85t87" podStartSLOduration=3.257309156 podStartE2EDuration="6.712871555s" podCreationTimestamp="2025-12-05 01:17:57 +0000 UTC" firstStartedPulling="2025-12-05 01:17:59.614016973 +0000 UTC m=+3298.829677923" lastFinishedPulling="2025-12-05 01:18:03.069579372 +0000 UTC m=+3302.285240322" observedRunningTime="2025-12-05 01:18:03.706357153 +0000 UTC m=+3302.922018113" watchObservedRunningTime="2025-12-05 01:18:03.712871555 +0000 UTC m=+3302.928532505" Dec 05 01:18:08 crc kubenswrapper[4759]: I1205 01:18:08.046398 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:18:08 crc kubenswrapper[4759]: I1205 01:18:08.046775 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:18:08 crc kubenswrapper[4759]: I1205 01:18:08.093387 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:18:08 crc kubenswrapper[4759]: I1205 01:18:08.794164 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:18:08 crc kubenswrapper[4759]: I1205 01:18:08.865752 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-85t87"] Dec 05 01:18:10 crc kubenswrapper[4759]: I1205 01:18:10.765884 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-85t87" podUID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" containerName="registry-server" containerID="cri-o://6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b" gracePeriod=2 Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.310347 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.367640 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkbgl\" (UniqueName: \"kubernetes.io/projected/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-kube-api-access-xkbgl\") pod \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.367783 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-utilities\") pod \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.367864 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-catalog-content\") pod \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\" (UID: \"b6dc2362-a615-4969-ba6d-c3ad72b7fd77\") " Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.369103 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-utilities" (OuterVolumeSpecName: "utilities") pod "b6dc2362-a615-4969-ba6d-c3ad72b7fd77" (UID: "b6dc2362-a615-4969-ba6d-c3ad72b7fd77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.374258 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-kube-api-access-xkbgl" (OuterVolumeSpecName: "kube-api-access-xkbgl") pod "b6dc2362-a615-4969-ba6d-c3ad72b7fd77" (UID: "b6dc2362-a615-4969-ba6d-c3ad72b7fd77"). InnerVolumeSpecName "kube-api-access-xkbgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.435613 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6dc2362-a615-4969-ba6d-c3ad72b7fd77" (UID: "b6dc2362-a615-4969-ba6d-c3ad72b7fd77"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.470998 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkbgl\" (UniqueName: \"kubernetes.io/projected/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-kube-api-access-xkbgl\") on node \"crc\" DevicePath \"\"" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.471023 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.471032 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2362-a615-4969-ba6d-c3ad72b7fd77-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.777592 4759 generic.go:334] "Generic (PLEG): container finished" podID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" containerID="6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b" exitCode=0 Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.777668 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85t87" event={"ID":"b6dc2362-a615-4969-ba6d-c3ad72b7fd77","Type":"ContainerDied","Data":"6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b"} Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.777729 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-85t87" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.777755 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85t87" event={"ID":"b6dc2362-a615-4969-ba6d-c3ad72b7fd77","Type":"ContainerDied","Data":"bdbfea4a894942d6b122a8db812b0e60ef1788546da63cf76ae5171887071ccb"} Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.777815 4759 scope.go:117] "RemoveContainer" containerID="6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.809366 4759 scope.go:117] "RemoveContainer" containerID="32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.846019 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-85t87"] Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.851585 4759 scope.go:117] "RemoveContainer" containerID="6d42c6c760c49686d05d935e6820290114658e155a24fe6839392433d27553ac" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.857025 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-85t87"] Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.904405 4759 scope.go:117] "RemoveContainer" containerID="6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b" Dec 05 01:18:11 crc kubenswrapper[4759]: E1205 01:18:11.904907 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b\": container with ID starting with 6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b not found: ID does not exist" containerID="6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.904959 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b"} err="failed to get container status \"6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b\": rpc error: code = NotFound desc = could not find container \"6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b\": container with ID starting with 6811460e6ebf2c8a7fcb1dc6745ea8d8f7bb38434cdba43be3d3da56d9813e1b not found: ID does not exist" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.904994 4759 scope.go:117] "RemoveContainer" containerID="32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad" Dec 05 01:18:11 crc kubenswrapper[4759]: E1205 01:18:11.906472 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad\": container with ID starting with 32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad not found: ID does not exist" containerID="32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.906495 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad"} err="failed to get container status \"32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad\": rpc error: code = NotFound desc = could not find 
container \"32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad\": container with ID starting with 32dfde793aa086656f42eeec3ff0049adde4e818c6ffc1fa9ef5b766815fa0ad not found: ID does not exist" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.906510 4759 scope.go:117] "RemoveContainer" containerID="6d42c6c760c49686d05d935e6820290114658e155a24fe6839392433d27553ac" Dec 05 01:18:11 crc kubenswrapper[4759]: E1205 01:18:11.906810 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d42c6c760c49686d05d935e6820290114658e155a24fe6839392433d27553ac\": container with ID starting with 6d42c6c760c49686d05d935e6820290114658e155a24fe6839392433d27553ac not found: ID does not exist" containerID="6d42c6c760c49686d05d935e6820290114658e155a24fe6839392433d27553ac" Dec 05 01:18:11 crc kubenswrapper[4759]: I1205 01:18:11.906884 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d42c6c760c49686d05d935e6820290114658e155a24fe6839392433d27553ac"} err="failed to get container status \"6d42c6c760c49686d05d935e6820290114658e155a24fe6839392433d27553ac\": rpc error: code = NotFound desc = could not find container \"6d42c6c760c49686d05d935e6820290114658e155a24fe6839392433d27553ac\": container with ID starting with 6d42c6c760c49686d05d935e6820290114658e155a24fe6839392433d27553ac not found: ID does not exist" Dec 05 01:18:13 crc kubenswrapper[4759]: I1205 01:18:13.202851 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" path="/var/lib/kubelet/pods/b6dc2362-a615-4969-ba6d-c3ad72b7fd77/volumes" Dec 05 01:18:34 crc kubenswrapper[4759]: I1205 01:18:34.433597 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:18:34 crc kubenswrapper[4759]: I1205 01:18:34.435123 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:18:40 crc kubenswrapper[4759]: I1205 01:18:40.132189 4759 generic.go:334] "Generic (PLEG): container finished" podID="9542247a-5527-4bd2-bc5d-8bd30be01c1d" containerID="babf2053be28ab77e1a81ac2e2265b4e3034b9b498e99c74a2e96e658fb7c36b" exitCode=0 Dec 05 01:18:40 crc kubenswrapper[4759]: I1205 01:18:40.132319 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" event={"ID":"9542247a-5527-4bd2-bc5d-8bd30be01c1d","Type":"ContainerDied","Data":"babf2053be28ab77e1a81ac2e2265b4e3034b9b498e99c74a2e96e658fb7c36b"} Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.584863 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.608683 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-bootstrap-combined-ca-bundle\") pod \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.609995 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwfk8\" (UniqueName: \"kubernetes.io/projected/9542247a-5527-4bd2-bc5d-8bd30be01c1d-kube-api-access-jwfk8\") pod \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.610161 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ceph\") pod \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.610191 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-inventory\") pod \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.610358 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ssh-key\") pod \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\" (UID: \"9542247a-5527-4bd2-bc5d-8bd30be01c1d\") " Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.617483 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ceph" (OuterVolumeSpecName: "ceph") pod "9542247a-5527-4bd2-bc5d-8bd30be01c1d" (UID: "9542247a-5527-4bd2-bc5d-8bd30be01c1d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.617588 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9542247a-5527-4bd2-bc5d-8bd30be01c1d-kube-api-access-jwfk8" (OuterVolumeSpecName: "kube-api-access-jwfk8") pod "9542247a-5527-4bd2-bc5d-8bd30be01c1d" (UID: "9542247a-5527-4bd2-bc5d-8bd30be01c1d"). InnerVolumeSpecName "kube-api-access-jwfk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.620479 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9542247a-5527-4bd2-bc5d-8bd30be01c1d" (UID: "9542247a-5527-4bd2-bc5d-8bd30be01c1d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.647991 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9542247a-5527-4bd2-bc5d-8bd30be01c1d" (UID: "9542247a-5527-4bd2-bc5d-8bd30be01c1d"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.663893 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-inventory" (OuterVolumeSpecName: "inventory") pod "9542247a-5527-4bd2-bc5d-8bd30be01c1d" (UID: "9542247a-5527-4bd2-bc5d-8bd30be01c1d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.712250 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwfk8\" (UniqueName: \"kubernetes.io/projected/9542247a-5527-4bd2-bc5d-8bd30be01c1d-kube-api-access-jwfk8\") on node \"crc\" DevicePath \"\"" Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.712286 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.712297 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.712321 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:18:41 crc kubenswrapper[4759]: I1205 01:18:41.712330 4759 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9542247a-5527-4bd2-bc5d-8bd30be01c1d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.155630 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" event={"ID":"9542247a-5527-4bd2-bc5d-8bd30be01c1d","Type":"ContainerDied","Data":"0bc706bf9dd8192f0f4654fabc5172bb158c28f6759906798abff09ade22dffd"} Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.155673 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc706bf9dd8192f0f4654fabc5172bb158c28f6759906798abff09ade22dffd" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.155722 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.322079 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr"] Dec 05 01:18:42 crc kubenswrapper[4759]: E1205 01:18:42.322820 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" containerName="extract-content" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.322839 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" containerName="extract-content" Dec 05 01:18:42 crc kubenswrapper[4759]: E1205 01:18:42.322852 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9542247a-5527-4bd2-bc5d-8bd30be01c1d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.322860 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9542247a-5527-4bd2-bc5d-8bd30be01c1d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 01:18:42 crc kubenswrapper[4759]: E1205 01:18:42.322888 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" containerName="registry-server" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.322894 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" containerName="registry-server" Dec 05 01:18:42 crc kubenswrapper[4759]: E1205 01:18:42.322915 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" containerName="extract-utilities" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.322921 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" containerName="extract-utilities" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.323131 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6dc2362-a615-4969-ba6d-c3ad72b7fd77" containerName="registry-server" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.323155 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9542247a-5527-4bd2-bc5d-8bd30be01c1d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.323891 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.326086 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.326090 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.329066 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.329211 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.329250 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.337726 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr"] Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.423722 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.424136 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45w4d\" (UniqueName: \"kubernetes.io/projected/fef9f327-ed85-42f2-a400-624e7c84374b-kube-api-access-45w4d\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.424261 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.424455 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.526209 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45w4d\" (UniqueName: \"kubernetes.io/projected/fef9f327-ed85-42f2-a400-624e7c84374b-kube-api-access-45w4d\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.526281 4759 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.526428 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.526522 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.534005 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.534021 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.534070 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.542579 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45w4d\" (UniqueName: \"kubernetes.io/projected/fef9f327-ed85-42f2-a400-624e7c84374b-kube-api-access-45w4d\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:42 crc kubenswrapper[4759]: I1205 01:18:42.638636 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:18:43 crc kubenswrapper[4759]: I1205 01:18:43.254856 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr"] Dec 05 01:18:43 crc kubenswrapper[4759]: W1205 01:18:43.268194 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfef9f327_ed85_42f2_a400_624e7c84374b.slice/crio-f822ee6923875a94b2e0f3f16e90ea7d4400b543714c0371369bc0991b0e599d WatchSource:0}: Error finding container f822ee6923875a94b2e0f3f16e90ea7d4400b543714c0371369bc0991b0e599d: Status 404 returned error can't find the container with id f822ee6923875a94b2e0f3f16e90ea7d4400b543714c0371369bc0991b0e599d Dec 05 01:18:44 crc kubenswrapper[4759]: I1205 01:18:44.174141 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" event={"ID":"fef9f327-ed85-42f2-a400-624e7c84374b","Type":"ContainerStarted","Data":"d1ebfa98d9d4e0747a42eb86654d8dee304436a5f51c9f3092ab3453ddcef7b5"} Dec 05 01:18:44 crc kubenswrapper[4759]: I1205 01:18:44.175411 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" event={"ID":"fef9f327-ed85-42f2-a400-624e7c84374b","Type":"ContainerStarted","Data":"f822ee6923875a94b2e0f3f16e90ea7d4400b543714c0371369bc0991b0e599d"} Dec 05 01:18:44 crc kubenswrapper[4759]: I1205 01:18:44.194699 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" podStartSLOduration=1.627669129 podStartE2EDuration="2.19467356s" podCreationTimestamp="2025-12-05 01:18:42 +0000 UTC" firstStartedPulling="2025-12-05 01:18:43.270626922 +0000 UTC m=+3342.486287882" lastFinishedPulling="2025-12-05 01:18:43.837631343 +0000 UTC m=+3343.053292313" observedRunningTime="2025-12-05 01:18:44.191843374 +0000 UTC m=+3343.407504324" watchObservedRunningTime="2025-12-05 01:18:44.19467356 +0000 UTC m=+3343.410334510" Dec 05 01:19:04 crc kubenswrapper[4759]: I1205 01:19:04.434017 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:19:04 crc kubenswrapper[4759]: I1205 01:19:04.434765 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:19:17 crc kubenswrapper[4759]: I1205 01:19:17.519390 4759 generic.go:334] "Generic (PLEG): container finished" podID="fef9f327-ed85-42f2-a400-624e7c84374b" containerID="d1ebfa98d9d4e0747a42eb86654d8dee304436a5f51c9f3092ab3453ddcef7b5" exitCode=0 Dec 05 01:19:17 crc kubenswrapper[4759]: I1205 01:19:17.519452 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" event={"ID":"fef9f327-ed85-42f2-a400-624e7c84374b","Type":"ContainerDied","Data":"d1ebfa98d9d4e0747a42eb86654d8dee304436a5f51c9f3092ab3453ddcef7b5"} Dec 05 01:19:19 crc 
kubenswrapper[4759]: I1205 01:19:19.078184 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.203795 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ceph\") pod \"fef9f327-ed85-42f2-a400-624e7c84374b\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.203911 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45w4d\" (UniqueName: \"kubernetes.io/projected/fef9f327-ed85-42f2-a400-624e7c84374b-kube-api-access-45w4d\") pod \"fef9f327-ed85-42f2-a400-624e7c84374b\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.204080 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-inventory\") pod \"fef9f327-ed85-42f2-a400-624e7c84374b\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.204201 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ssh-key\") pod \"fef9f327-ed85-42f2-a400-624e7c84374b\" (UID: \"fef9f327-ed85-42f2-a400-624e7c84374b\") " Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.215139 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ceph" (OuterVolumeSpecName: "ceph") pod "fef9f327-ed85-42f2-a400-624e7c84374b" (UID: "fef9f327-ed85-42f2-a400-624e7c84374b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.215458 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fef9f327-ed85-42f2-a400-624e7c84374b-kube-api-access-45w4d" (OuterVolumeSpecName: "kube-api-access-45w4d") pod "fef9f327-ed85-42f2-a400-624e7c84374b" (UID: "fef9f327-ed85-42f2-a400-624e7c84374b"). InnerVolumeSpecName "kube-api-access-45w4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.234978 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fef9f327-ed85-42f2-a400-624e7c84374b" (UID: "fef9f327-ed85-42f2-a400-624e7c84374b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.245995 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-inventory" (OuterVolumeSpecName: "inventory") pod "fef9f327-ed85-42f2-a400-624e7c84374b" (UID: "fef9f327-ed85-42f2-a400-624e7c84374b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.306714 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.307573 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.307638 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45w4d\" (UniqueName: \"kubernetes.io/projected/fef9f327-ed85-42f2-a400-624e7c84374b-kube-api-access-45w4d\") on node \"crc\" DevicePath \"\"" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.307662 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fef9f327-ed85-42f2-a400-624e7c84374b-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.544166 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" event={"ID":"fef9f327-ed85-42f2-a400-624e7c84374b","Type":"ContainerDied","Data":"f822ee6923875a94b2e0f3f16e90ea7d4400b543714c0371369bc0991b0e599d"} Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.544215 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f822ee6923875a94b2e0f3f16e90ea7d4400b543714c0371369bc0991b0e599d" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.544245 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.633015 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k"] Dec 05 01:19:19 crc kubenswrapper[4759]: E1205 01:19:19.633443 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fef9f327-ed85-42f2-a400-624e7c84374b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.633460 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef9f327-ed85-42f2-a400-624e7c84374b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.633657 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="fef9f327-ed85-42f2-a400-624e7c84374b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.634414 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.636487 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.636734 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.636794 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.637346 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.638258 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.660465 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k"] Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.716237 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.716424 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.716472 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.716658 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p85dm\" (UniqueName: \"kubernetes.io/projected/643a4a0e-1e9d-43a0-927c-ddb0778691f3-kube-api-access-p85dm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.817449 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.817524 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.817548 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.817615 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p85dm\" (UniqueName: \"kubernetes.io/projected/643a4a0e-1e9d-43a0-927c-ddb0778691f3-kube-api-access-p85dm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.827863 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.828123 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.829088 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.836741 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p85dm\" (UniqueName: \"kubernetes.io/projected/643a4a0e-1e9d-43a0-927c-ddb0778691f3-kube-api-access-p85dm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:19 crc kubenswrapper[4759]: I1205 01:19:19.954608 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:20 crc kubenswrapper[4759]: I1205 01:19:20.522565 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k"] Dec 05 01:19:20 crc kubenswrapper[4759]: I1205 01:19:20.529053 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 01:19:20 crc kubenswrapper[4759]: I1205 01:19:20.555990 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" event={"ID":"643a4a0e-1e9d-43a0-927c-ddb0778691f3","Type":"ContainerStarted","Data":"101d0441f9bebf74e46ae89ebd3289921a726e1a562dea675ff383af5a079c06"} Dec 05 01:19:21 crc kubenswrapper[4759]: I1205 01:19:21.566951 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" event={"ID":"643a4a0e-1e9d-43a0-927c-ddb0778691f3","Type":"ContainerStarted","Data":"603adb408a2c8ceebfb26e14a619090ef4375c823f3b99c9ccf626e19c31be44"} Dec 05 01:19:21 crc kubenswrapper[4759]: I1205 01:19:21.590505 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" podStartSLOduration=2.086688854 podStartE2EDuration="2.590481857s" podCreationTimestamp="2025-12-05 01:19:19 +0000 UTC" firstStartedPulling="2025-12-05 01:19:20.528685942 +0000 UTC m=+3379.744346922" lastFinishedPulling="2025-12-05 01:19:21.032478975 +0000 UTC m=+3380.248139925" observedRunningTime="2025-12-05 01:19:21.586065645 +0000 UTC m=+3380.801726595" watchObservedRunningTime="2025-12-05 01:19:21.590481857 +0000 UTC m=+3380.806142817" Dec 05 01:19:28 crc kubenswrapper[4759]: I1205 01:19:28.640800 4759 generic.go:334] "Generic (PLEG): container finished" podID="643a4a0e-1e9d-43a0-927c-ddb0778691f3" containerID="603adb408a2c8ceebfb26e14a619090ef4375c823f3b99c9ccf626e19c31be44" exitCode=0 Dec 05 01:19:28 crc kubenswrapper[4759]: I1205 01:19:28.640854 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" event={"ID":"643a4a0e-1e9d-43a0-927c-ddb0778691f3","Type":"ContainerDied","Data":"603adb408a2c8ceebfb26e14a619090ef4375c823f3b99c9ccf626e19c31be44"} Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.211952 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.331748 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ssh-key\") pod \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.332141 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p85dm\" (UniqueName: \"kubernetes.io/projected/643a4a0e-1e9d-43a0-927c-ddb0778691f3-kube-api-access-p85dm\") pod \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.332222 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ceph\") pod \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.332382 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-inventory\") pod \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\" (UID: \"643a4a0e-1e9d-43a0-927c-ddb0778691f3\") " Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.337908 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ceph" (OuterVolumeSpecName: "ceph") pod "643a4a0e-1e9d-43a0-927c-ddb0778691f3" (UID: "643a4a0e-1e9d-43a0-927c-ddb0778691f3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.345767 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/643a4a0e-1e9d-43a0-927c-ddb0778691f3-kube-api-access-p85dm" (OuterVolumeSpecName: "kube-api-access-p85dm") pod "643a4a0e-1e9d-43a0-927c-ddb0778691f3" (UID: "643a4a0e-1e9d-43a0-927c-ddb0778691f3"). InnerVolumeSpecName "kube-api-access-p85dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.364975 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "643a4a0e-1e9d-43a0-927c-ddb0778691f3" (UID: "643a4a0e-1e9d-43a0-927c-ddb0778691f3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.381452 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-inventory" (OuterVolumeSpecName: "inventory") pod "643a4a0e-1e9d-43a0-927c-ddb0778691f3" (UID: "643a4a0e-1e9d-43a0-927c-ddb0778691f3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.435529 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p85dm\" (UniqueName: \"kubernetes.io/projected/643a4a0e-1e9d-43a0-927c-ddb0778691f3-kube-api-access-p85dm\") on node \"crc\" DevicePath \"\"" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.435590 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.435611 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.435629 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/643a4a0e-1e9d-43a0-927c-ddb0778691f3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.660179 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" event={"ID":"643a4a0e-1e9d-43a0-927c-ddb0778691f3","Type":"ContainerDied","Data":"101d0441f9bebf74e46ae89ebd3289921a726e1a562dea675ff383af5a079c06"} Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.660693 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="101d0441f9bebf74e46ae89ebd3289921a726e1a562dea675ff383af5a079c06" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.660392 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.815417 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt"] Dec 05 01:19:30 crc kubenswrapper[4759]: E1205 01:19:30.815904 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643a4a0e-1e9d-43a0-927c-ddb0778691f3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.815925 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="643a4a0e-1e9d-43a0-927c-ddb0778691f3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.816155 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="643a4a0e-1e9d-43a0-927c-ddb0778691f3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.817070 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.819517 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.820410 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.821362 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.821588 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.834484 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.834968 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt"] Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.959178 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.959278 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.959386 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:30 crc kubenswrapper[4759]: I1205 01:19:30.959494 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5s27\" (UniqueName: \"kubernetes.io/projected/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-kube-api-access-q5s27\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:31 crc kubenswrapper[4759]: I1205 01:19:31.061678 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:31 crc kubenswrapper[4759]: I1205 01:19:31.061739 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:31 crc kubenswrapper[4759]: I1205 01:19:31.061805 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:31 crc kubenswrapper[4759]: I1205 01:19:31.061901 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5s27\" (UniqueName: \"kubernetes.io/projected/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-kube-api-access-q5s27\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:31 crc kubenswrapper[4759]: I1205 01:19:31.065954 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:31 crc kubenswrapper[4759]: I1205 01:19:31.066730 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:31 crc kubenswrapper[4759]: I1205 01:19:31.070757 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:31 crc kubenswrapper[4759]: I1205 01:19:31.086783 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5s27\" (UniqueName: \"kubernetes.io/projected/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-kube-api-access-q5s27\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n44mt\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:31 crc kubenswrapper[4759]: I1205 01:19:31.139725 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:19:31 crc kubenswrapper[4759]: I1205 01:19:31.771749 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt"] Dec 05 01:19:32 crc kubenswrapper[4759]: I1205 01:19:32.681882 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" event={"ID":"a0d1528b-9aba-49f6-982a-c0dc44cec8a8","Type":"ContainerStarted","Data":"98110d1c62a882ac7c3599b5aaa9aa3009cf896aba62da2db954a8aab2b16216"} Dec 05 01:19:32 crc kubenswrapper[4759]: I1205 01:19:32.682265 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" event={"ID":"a0d1528b-9aba-49f6-982a-c0dc44cec8a8","Type":"ContainerStarted","Data":"b805a3132ac9ad9ee8e2b7d56843ee6e606c14b9d6e8d3d3f1743f5f14f91631"} Dec 05 01:19:34 crc kubenswrapper[4759]: I1205 01:19:34.433933 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:19:34 crc kubenswrapper[4759]: I1205 01:19:34.434538 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:19:34 crc kubenswrapper[4759]: I1205 01:19:34.434603 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 01:19:34 crc kubenswrapper[4759]: I1205 01:19:34.435649 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"666b9dc92fa2ce40e04f9e0757fc98a1d90be948a6233de9b66c266d66b60f32"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 01:19:34 crc kubenswrapper[4759]: I1205 01:19:34.435740 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://666b9dc92fa2ce40e04f9e0757fc98a1d90be948a6233de9b66c266d66b60f32" gracePeriod=600 Dec 05 01:19:34 crc kubenswrapper[4759]: I1205 01:19:34.702511 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="666b9dc92fa2ce40e04f9e0757fc98a1d90be948a6233de9b66c266d66b60f32" exitCode=0 Dec 05 01:19:34 crc kubenswrapper[4759]: I1205 01:19:34.702570 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"666b9dc92fa2ce40e04f9e0757fc98a1d90be948a6233de9b66c266d66b60f32"} Dec 05 01:19:34 crc kubenswrapper[4759]: I1205 01:19:34.702801 4759 scope.go:117] "RemoveContainer" containerID="c289326b0bd8d5aa52d75a1efe54c1b7e8f3fe3ea0eebc86f544cac523abaade" Dec 05 01:19:35 crc kubenswrapper[4759]: 
I1205 01:19:35.714215 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d"} Dec 05 01:19:35 crc kubenswrapper[4759]: I1205 01:19:35.731600 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" podStartSLOduration=5.299003085 podStartE2EDuration="5.731583076s" podCreationTimestamp="2025-12-05 01:19:30 +0000 UTC" firstStartedPulling="2025-12-05 01:19:31.782607643 +0000 UTC m=+3390.998268603" lastFinishedPulling="2025-12-05 01:19:32.215187644 +0000 UTC m=+3391.430848594" observedRunningTime="2025-12-05 01:19:32.70700002 +0000 UTC m=+3391.922660980" watchObservedRunningTime="2025-12-05 01:19:35.731583076 +0000 UTC m=+3394.947244026" Dec 05 01:20:24 crc kubenswrapper[4759]: I1205 01:20:24.343181 4759 generic.go:334] "Generic (PLEG): container finished" podID="a0d1528b-9aba-49f6-982a-c0dc44cec8a8" containerID="98110d1c62a882ac7c3599b5aaa9aa3009cf896aba62da2db954a8aab2b16216" exitCode=0 Dec 05 01:20:24 crc kubenswrapper[4759]: I1205 01:20:24.343274 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" event={"ID":"a0d1528b-9aba-49f6-982a-c0dc44cec8a8","Type":"ContainerDied","Data":"98110d1c62a882ac7c3599b5aaa9aa3009cf896aba62da2db954a8aab2b16216"} Dec 05 01:20:25 crc kubenswrapper[4759]: I1205 01:20:25.935549 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.003217 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-inventory\") pod \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.003447 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ssh-key\") pod \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.003600 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ceph\") pod \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.003631 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5s27\" (UniqueName: \"kubernetes.io/projected/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-kube-api-access-q5s27\") pod \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\" (UID: \"a0d1528b-9aba-49f6-982a-c0dc44cec8a8\") " Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.009480 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-kube-api-access-q5s27" (OuterVolumeSpecName: "kube-api-access-q5s27") pod "a0d1528b-9aba-49f6-982a-c0dc44cec8a8" (UID: "a0d1528b-9aba-49f6-982a-c0dc44cec8a8"). InnerVolumeSpecName "kube-api-access-q5s27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.010390 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ceph" (OuterVolumeSpecName: "ceph") pod "a0d1528b-9aba-49f6-982a-c0dc44cec8a8" (UID: "a0d1528b-9aba-49f6-982a-c0dc44cec8a8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.037160 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a0d1528b-9aba-49f6-982a-c0dc44cec8a8" (UID: "a0d1528b-9aba-49f6-982a-c0dc44cec8a8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.039513 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-inventory" (OuterVolumeSpecName: "inventory") pod "a0d1528b-9aba-49f6-982a-c0dc44cec8a8" (UID: "a0d1528b-9aba-49f6-982a-c0dc44cec8a8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.106446 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.106477 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.106490 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5s27\" (UniqueName: \"kubernetes.io/projected/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-kube-api-access-q5s27\") on node \"crc\" DevicePath \"\"" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.106502 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d1528b-9aba-49f6-982a-c0dc44cec8a8-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.371840 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" event={"ID":"a0d1528b-9aba-49f6-982a-c0dc44cec8a8","Type":"ContainerDied","Data":"b805a3132ac9ad9ee8e2b7d56843ee6e606c14b9d6e8d3d3f1743f5f14f91631"} Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.372284 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b805a3132ac9ad9ee8e2b7d56843ee6e606c14b9d6e8d3d3f1743f5f14f91631" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.371914 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n44mt" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.484527 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2"] Dec 05 01:20:26 crc kubenswrapper[4759]: E1205 01:20:26.484983 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d1528b-9aba-49f6-982a-c0dc44cec8a8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.485003 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d1528b-9aba-49f6-982a-c0dc44cec8a8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.485199 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d1528b-9aba-49f6-982a-c0dc44cec8a8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.485962 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.488371 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.489394 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.489500 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.489665 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.491173 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.512141 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2"] Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.514294 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.514366 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz9bl\" (UniqueName: \"kubernetes.io/projected/8d7edc63-8bf1-4356-bc8a-c719049e0cee-kube-api-access-wz9bl\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.514587 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.514839 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.618900 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.619206 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.619399 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.619521 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz9bl\" (UniqueName: \"kubernetes.io/projected/8d7edc63-8bf1-4356-bc8a-c719049e0cee-kube-api-access-wz9bl\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.624579 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.625462 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.632208 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.642375 4759 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz9bl\" (UniqueName: \"kubernetes.io/projected/8d7edc63-8bf1-4356-bc8a-c719049e0cee-kube-api-access-wz9bl\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:26 crc kubenswrapper[4759]: I1205 01:20:26.815785 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:27 crc kubenswrapper[4759]: W1205 01:20:27.478012 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d7edc63_8bf1_4356_bc8a_c719049e0cee.slice/crio-c0ecbdb26bc4c6a74625a5a76a8c809b02fd74dd46445a420c027e6d315239d0 WatchSource:0}: Error finding container c0ecbdb26bc4c6a74625a5a76a8c809b02fd74dd46445a420c027e6d315239d0: Status 404 returned error can't find the container with id c0ecbdb26bc4c6a74625a5a76a8c809b02fd74dd46445a420c027e6d315239d0 Dec 05 01:20:27 crc kubenswrapper[4759]: I1205 01:20:27.480192 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2"] Dec 05 01:20:28 crc kubenswrapper[4759]: I1205 01:20:28.408678 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" event={"ID":"8d7edc63-8bf1-4356-bc8a-c719049e0cee","Type":"ContainerStarted","Data":"c0ecbdb26bc4c6a74625a5a76a8c809b02fd74dd46445a420c027e6d315239d0"} Dec 05 01:20:29 crc kubenswrapper[4759]: I1205 01:20:29.424756 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" event={"ID":"8d7edc63-8bf1-4356-bc8a-c719049e0cee","Type":"ContainerStarted","Data":"f3d7cceaf70fd372b9122dddb3f4acc216127d45bc5178d180339a2affea823e"} Dec 05 01:20:29 crc kubenswrapper[4759]: I1205 01:20:29.460092 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" podStartSLOduration=2.965969897 podStartE2EDuration="3.460066326s" podCreationTimestamp="2025-12-05 01:20:26 +0000 UTC" firstStartedPulling="2025-12-05 01:20:27.482842701 +0000 UTC m=+3446.698503681" lastFinishedPulling="2025-12-05 01:20:27.97693917 +0000 UTC m=+3447.192600110" observedRunningTime="2025-12-05 01:20:29.447121126 +0000 UTC m=+3448.662782116" watchObservedRunningTime="2025-12-05 01:20:29.460066326 +0000 UTC m=+3448.675727286" Dec 05 01:20:33 crc kubenswrapper[4759]: I1205 01:20:33.473385 4759 generic.go:334] "Generic (PLEG): container finished" podID="8d7edc63-8bf1-4356-bc8a-c719049e0cee" containerID="f3d7cceaf70fd372b9122dddb3f4acc216127d45bc5178d180339a2affea823e" exitCode=0 Dec 05 01:20:33 crc kubenswrapper[4759]: I1205 01:20:33.473457 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" event={"ID":"8d7edc63-8bf1-4356-bc8a-c719049e0cee","Type":"ContainerDied","Data":"f3d7cceaf70fd372b9122dddb3f4acc216127d45bc5178d180339a2affea823e"} Dec 05 01:20:34 crc kubenswrapper[4759]: I1205 01:20:34.994853 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.021944 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ceph\") pod \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.022205 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ssh-key\") pod \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.022486 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-inventory\") pod \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.022818 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz9bl\" (UniqueName: \"kubernetes.io/projected/8d7edc63-8bf1-4356-bc8a-c719049e0cee-kube-api-access-wz9bl\") pod \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\" (UID: \"8d7edc63-8bf1-4356-bc8a-c719049e0cee\") " Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.035821 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d7edc63-8bf1-4356-bc8a-c719049e0cee-kube-api-access-wz9bl" (OuterVolumeSpecName: "kube-api-access-wz9bl") pod "8d7edc63-8bf1-4356-bc8a-c719049e0cee" (UID: "8d7edc63-8bf1-4356-bc8a-c719049e0cee"). InnerVolumeSpecName "kube-api-access-wz9bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.035883 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ceph" (OuterVolumeSpecName: "ceph") pod "8d7edc63-8bf1-4356-bc8a-c719049e0cee" (UID: "8d7edc63-8bf1-4356-bc8a-c719049e0cee"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.072357 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-inventory" (OuterVolumeSpecName: "inventory") pod "8d7edc63-8bf1-4356-bc8a-c719049e0cee" (UID: "8d7edc63-8bf1-4356-bc8a-c719049e0cee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.077168 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8d7edc63-8bf1-4356-bc8a-c719049e0cee" (UID: "8d7edc63-8bf1-4356-bc8a-c719049e0cee"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.125425 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz9bl\" (UniqueName: \"kubernetes.io/projected/8d7edc63-8bf1-4356-bc8a-c719049e0cee-kube-api-access-wz9bl\") on node \"crc\" DevicePath \"\"" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.125457 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.125466 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.125475 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d7edc63-8bf1-4356-bc8a-c719049e0cee-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.506153 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" event={"ID":"8d7edc63-8bf1-4356-bc8a-c719049e0cee","Type":"ContainerDied","Data":"c0ecbdb26bc4c6a74625a5a76a8c809b02fd74dd46445a420c027e6d315239d0"} Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.506212 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0ecbdb26bc4c6a74625a5a76a8c809b02fd74dd46445a420c027e6d315239d0" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.506235 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.660807 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7"] Dec 05 01:20:35 crc kubenswrapper[4759]: E1205 01:20:35.661648 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d7edc63-8bf1-4356-bc8a-c719049e0cee" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.661671 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7edc63-8bf1-4356-bc8a-c719049e0cee" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.662047 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d7edc63-8bf1-4356-bc8a-c719049e0cee" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.663122 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7"] Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.663205 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.667264 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.675116 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.675657 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.678225 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.689381 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.768153 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.768553 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.768674 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cx9j\" (UniqueName: \"kubernetes.io/projected/3da88e6e-b264-4624-a173-5dd09edf5066-kube-api-access-5cx9j\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.768819 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.870896 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.871172 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") 
" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.871265 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cx9j\" (UniqueName: \"kubernetes.io/projected/3da88e6e-b264-4624-a173-5dd09edf5066-kube-api-access-5cx9j\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.871406 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.876030 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.877134 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.877907 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:35 crc kubenswrapper[4759]: I1205 01:20:35.900653 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cx9j\" (UniqueName: \"kubernetes.io/projected/3da88e6e-b264-4624-a173-5dd09edf5066-kube-api-access-5cx9j\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:36 crc kubenswrapper[4759]: I1205 01:20:36.019033 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:20:36 crc kubenswrapper[4759]: I1205 01:20:36.631056 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7"] Dec 05 01:20:37 crc kubenswrapper[4759]: I1205 01:20:37.536132 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" event={"ID":"3da88e6e-b264-4624-a173-5dd09edf5066","Type":"ContainerStarted","Data":"06ab4a80ec4700ff161ac82cd00c2debe8453c7439ac300377984111efdff75c"} Dec 05 01:20:37 crc kubenswrapper[4759]: I1205 01:20:37.536553 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" event={"ID":"3da88e6e-b264-4624-a173-5dd09edf5066","Type":"ContainerStarted","Data":"9daade5ade90d0d31cad89432945ca8eb99bea27a597e4434277c8b160a944c1"} Dec 05 01:20:37 crc kubenswrapper[4759]: I1205 01:20:37.555546 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" podStartSLOduration=2.126726698 podStartE2EDuration="2.555523332s" podCreationTimestamp="2025-12-05 01:20:35 +0000 UTC" firstStartedPulling="2025-12-05 01:20:36.642818617 +0000 UTC m=+3455.858479557" lastFinishedPulling="2025-12-05 01:20:37.071615231 +0000 UTC m=+3456.287276191" observedRunningTime="2025-12-05 01:20:37.552127904 +0000 UTC m=+3456.767788864" watchObservedRunningTime="2025-12-05 01:20:37.555523332 +0000 UTC m=+3456.771184292" Dec 05 01:21:34 crc kubenswrapper[4759]: I1205 01:21:34.433226 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:21:34 crc kubenswrapper[4759]: I1205 01:21:34.433885 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:21:42 crc kubenswrapper[4759]: I1205 01:21:42.283134 4759 generic.go:334] "Generic (PLEG): container finished" podID="3da88e6e-b264-4624-a173-5dd09edf5066" containerID="06ab4a80ec4700ff161ac82cd00c2debe8453c7439ac300377984111efdff75c" exitCode=0 Dec 05 01:21:42 crc kubenswrapper[4759]: I1205 01:21:42.283213 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" event={"ID":"3da88e6e-b264-4624-a173-5dd09edf5066","Type":"ContainerDied","Data":"06ab4a80ec4700ff161ac82cd00c2debe8453c7439ac300377984111efdff75c"} Dec 05 01:21:43 crc kubenswrapper[4759]: I1205 01:21:43.815353 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:21:43 crc kubenswrapper[4759]: I1205 01:21:43.905524 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ssh-key\") pod \"3da88e6e-b264-4624-a173-5dd09edf5066\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " Dec 05 01:21:43 crc kubenswrapper[4759]: I1205 01:21:43.905587 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cx9j\" (UniqueName: \"kubernetes.io/projected/3da88e6e-b264-4624-a173-5dd09edf5066-kube-api-access-5cx9j\") pod \"3da88e6e-b264-4624-a173-5dd09edf5066\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " Dec 05 01:21:43 crc kubenswrapper[4759]: I1205 01:21:43.905618 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ceph\") pod \"3da88e6e-b264-4624-a173-5dd09edf5066\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " Dec 05 01:21:43 crc kubenswrapper[4759]: I1205 01:21:43.905819 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-inventory\") pod \"3da88e6e-b264-4624-a173-5dd09edf5066\" (UID: \"3da88e6e-b264-4624-a173-5dd09edf5066\") " Dec 05 01:21:43 crc kubenswrapper[4759]: I1205 01:21:43.920108 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ceph" (OuterVolumeSpecName: "ceph") pod "3da88e6e-b264-4624-a173-5dd09edf5066" (UID: "3da88e6e-b264-4624-a173-5dd09edf5066"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:21:43 crc kubenswrapper[4759]: I1205 01:21:43.931699 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da88e6e-b264-4624-a173-5dd09edf5066-kube-api-access-5cx9j" (OuterVolumeSpecName: "kube-api-access-5cx9j") pod "3da88e6e-b264-4624-a173-5dd09edf5066" (UID: "3da88e6e-b264-4624-a173-5dd09edf5066"). InnerVolumeSpecName "kube-api-access-5cx9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:21:43 crc kubenswrapper[4759]: I1205 01:21:43.941983 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-inventory" (OuterVolumeSpecName: "inventory") pod "3da88e6e-b264-4624-a173-5dd09edf5066" (UID: "3da88e6e-b264-4624-a173-5dd09edf5066"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:21:43 crc kubenswrapper[4759]: I1205 01:21:43.947430 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3da88e6e-b264-4624-a173-5dd09edf5066" (UID: "3da88e6e-b264-4624-a173-5dd09edf5066"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.008563 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.008596 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.008605 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cx9j\" (UniqueName: \"kubernetes.io/projected/3da88e6e-b264-4624-a173-5dd09edf5066-kube-api-access-5cx9j\") on node \"crc\" DevicePath \"\"" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.008616 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3da88e6e-b264-4624-a173-5dd09edf5066-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.307248 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" event={"ID":"3da88e6e-b264-4624-a173-5dd09edf5066","Type":"ContainerDied","Data":"9daade5ade90d0d31cad89432945ca8eb99bea27a597e4434277c8b160a944c1"} Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.307288 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9daade5ade90d0d31cad89432945ca8eb99bea27a597e4434277c8b160a944c1" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.307354 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.436680 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x9fpm"] Dec 05 01:21:44 crc kubenswrapper[4759]: E1205 01:21:44.437522 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da88e6e-b264-4624-a173-5dd09edf5066" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.437624 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da88e6e-b264-4624-a173-5dd09edf5066" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.438001 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da88e6e-b264-4624-a173-5dd09edf5066" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.438991 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.441000 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.441234 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.441921 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.442452 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.447811 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x9fpm"] Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.451021 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.524606 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.524715 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.524796 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2kln\" (UniqueName: \"kubernetes.io/projected/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-kube-api-access-l2kln\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.524823 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ceph\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.627557 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.627731 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2kln\" (UniqueName: \"kubernetes.io/projected/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-kube-api-access-l2kln\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" 
(UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.627790 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ceph\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.627887 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.633330 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.633806 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.634748 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ceph\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.647923 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2kln\" (UniqueName: \"kubernetes.io/projected/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-kube-api-access-l2kln\") pod \"ssh-known-hosts-edpm-deployment-x9fpm\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:44 crc kubenswrapper[4759]: I1205 01:21:44.754875 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:21:45 crc kubenswrapper[4759]: I1205 01:21:45.314492 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x9fpm"] Dec 05 01:21:46 crc kubenswrapper[4759]: I1205 01:21:46.334948 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" event={"ID":"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef","Type":"ContainerStarted","Data":"ae5fc77d81b54f4692ec14641be534fe4cd7bcc201a09af404490040e21bdd41"} Dec 05 01:21:46 crc kubenswrapper[4759]: I1205 01:21:46.335401 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" event={"ID":"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef","Type":"ContainerStarted","Data":"f6bf4ae8f43f92406b8458f911d77cea75f1b4148830d8a7914298503d281716"} Dec 05 01:21:46 crc kubenswrapper[4759]: I1205 01:21:46.359846 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" podStartSLOduration=1.8222194680000001 podStartE2EDuration="2.359818076s" podCreationTimestamp="2025-12-05 01:21:44 +0000 UTC" firstStartedPulling="2025-12-05 01:21:45.322376638 +0000 UTC m=+3524.538037588" lastFinishedPulling="2025-12-05 01:21:45.859975206 +0000 UTC m=+3525.075636196" observedRunningTime="2025-12-05 01:21:46.3548208 +0000 UTC m=+3525.570481770" watchObservedRunningTime="2025-12-05 01:21:46.359818076 +0000 UTC m=+3525.575479066" Dec 05 01:21:59 crc kubenswrapper[4759]: I1205 01:21:59.523348 4759 generic.go:334] "Generic (PLEG): container finished" podID="cf4930f4-da24-4012-b8d1-1bcb0d5b0bef" containerID="ae5fc77d81b54f4692ec14641be534fe4cd7bcc201a09af404490040e21bdd41" exitCode=0 Dec 05 01:21:59 crc kubenswrapper[4759]: I1205 01:21:59.523418 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" event={"ID":"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef","Type":"ContainerDied","Data":"ae5fc77d81b54f4692ec14641be534fe4cd7bcc201a09af404490040e21bdd41"} Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.151906 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.262546 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ceph\") pod \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.262895 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ssh-key-openstack-edpm-ipam\") pod \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.263078 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-inventory-0\") pod \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.263132 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2kln\" (UniqueName: \"kubernetes.io/projected/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-kube-api-access-l2kln\") pod \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\" (UID: \"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef\") " Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.278512 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ceph" (OuterVolumeSpecName: "ceph") pod "cf4930f4-da24-4012-b8d1-1bcb0d5b0bef" (UID: "cf4930f4-da24-4012-b8d1-1bcb0d5b0bef"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.279229 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-kube-api-access-l2kln" (OuterVolumeSpecName: "kube-api-access-l2kln") pod "cf4930f4-da24-4012-b8d1-1bcb0d5b0bef" (UID: "cf4930f4-da24-4012-b8d1-1bcb0d5b0bef"). InnerVolumeSpecName "kube-api-access-l2kln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.301474 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cf4930f4-da24-4012-b8d1-1bcb0d5b0bef" (UID: "cf4930f4-da24-4012-b8d1-1bcb0d5b0bef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.315662 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "cf4930f4-da24-4012-b8d1-1bcb0d5b0bef" (UID: "cf4930f4-da24-4012-b8d1-1bcb0d5b0bef"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.366373 4759 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.366418 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2kln\" (UniqueName: \"kubernetes.io/projected/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-kube-api-access-l2kln\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.366436 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.366449 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf4930f4-da24-4012-b8d1-1bcb0d5b0bef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.553181 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" event={"ID":"cf4930f4-da24-4012-b8d1-1bcb0d5b0bef","Type":"ContainerDied","Data":"f6bf4ae8f43f92406b8458f911d77cea75f1b4148830d8a7914298503d281716"} Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.553242 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6bf4ae8f43f92406b8458f911d77cea75f1b4148830d8a7914298503d281716" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.553278 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x9fpm" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.696011 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq"] Dec 05 01:22:01 crc kubenswrapper[4759]: E1205 01:22:01.697153 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4930f4-da24-4012-b8d1-1bcb0d5b0bef" containerName="ssh-known-hosts-edpm-deployment" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.697201 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4930f4-da24-4012-b8d1-1bcb0d5b0bef" containerName="ssh-known-hosts-edpm-deployment" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.697739 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4930f4-da24-4012-b8d1-1bcb0d5b0bef" containerName="ssh-known-hosts-edpm-deployment" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.707093 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.710772 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.711404 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.711660 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.711887 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.712114 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.713590 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq"] Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.880274 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk285\" (UniqueName: \"kubernetes.io/projected/9b0d860b-da25-46cf-abf7-17755154fc43-kube-api-access-nk285\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.880607 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.880704 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.880723 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.982604 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk285\" (UniqueName: \"kubernetes.io/projected/9b0d860b-da25-46cf-abf7-17755154fc43-kube-api-access-nk285\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.982871 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.983069 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.983169 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.993182 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.993200 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:01 crc kubenswrapper[4759]: I1205 01:22:01.993405 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:02 crc kubenswrapper[4759]: I1205 01:22:01.999957 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk285\" (UniqueName: \"kubernetes.io/projected/9b0d860b-da25-46cf-abf7-17755154fc43-kube-api-access-nk285\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4n8lq\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:02 crc kubenswrapper[4759]: I1205 01:22:02.034414 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:02 crc kubenswrapper[4759]: I1205 01:22:02.626463 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq"] Dec 05 01:22:03 crc kubenswrapper[4759]: I1205 01:22:03.581450 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" event={"ID":"9b0d860b-da25-46cf-abf7-17755154fc43","Type":"ContainerStarted","Data":"a19f587c582d426805daa697e1e841615c2a1bd9cc39cebb1e840cf43b21a00d"} Dec 05 01:22:03 crc kubenswrapper[4759]: I1205 01:22:03.582167 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" event={"ID":"9b0d860b-da25-46cf-abf7-17755154fc43","Type":"ContainerStarted","Data":"892fb6d4284019e378cc78eabcec79118989dc6382755301236dcc048d758d37"} Dec 05 01:22:03 crc kubenswrapper[4759]: I1205 01:22:03.604276 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" podStartSLOduration=2.1078193020000002 podStartE2EDuration="2.60424653s" podCreationTimestamp="2025-12-05 01:22:01 +0000 UTC" firstStartedPulling="2025-12-05 01:22:02.630920943 +0000 UTC m=+3541.846581893" lastFinishedPulling="2025-12-05 01:22:03.127348171 +0000 UTC m=+3542.343009121" observedRunningTime="2025-12-05 01:22:03.596019474 +0000 UTC m=+3542.811680444" watchObservedRunningTime="2025-12-05 01:22:03.60424653 +0000 UTC m=+3542.819907490" Dec 05 01:22:04 crc kubenswrapper[4759]: I1205 01:22:04.433993 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:22:04 crc kubenswrapper[4759]: I1205 01:22:04.434096 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:22:13 crc kubenswrapper[4759]: I1205 01:22:13.706202 4759 generic.go:334] "Generic (PLEG): container finished" podID="9b0d860b-da25-46cf-abf7-17755154fc43" containerID="a19f587c582d426805daa697e1e841615c2a1bd9cc39cebb1e840cf43b21a00d" exitCode=0 Dec 05 01:22:13 crc kubenswrapper[4759]: I1205 01:22:13.706335 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" event={"ID":"9b0d860b-da25-46cf-abf7-17755154fc43","Type":"ContainerDied","Data":"a19f587c582d426805daa697e1e841615c2a1bd9cc39cebb1e840cf43b21a00d"} Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.233597 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.282014 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ceph\") pod \"9b0d860b-da25-46cf-abf7-17755154fc43\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.282417 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-inventory\") pod \"9b0d860b-da25-46cf-abf7-17755154fc43\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.282451 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ssh-key\") pod \"9b0d860b-da25-46cf-abf7-17755154fc43\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.282477 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk285\" (UniqueName: \"kubernetes.io/projected/9b0d860b-da25-46cf-abf7-17755154fc43-kube-api-access-nk285\") pod \"9b0d860b-da25-46cf-abf7-17755154fc43\" (UID: \"9b0d860b-da25-46cf-abf7-17755154fc43\") " Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.292011 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b0d860b-da25-46cf-abf7-17755154fc43-kube-api-access-nk285" (OuterVolumeSpecName: "kube-api-access-nk285") pod "9b0d860b-da25-46cf-abf7-17755154fc43" (UID: "9b0d860b-da25-46cf-abf7-17755154fc43"). InnerVolumeSpecName "kube-api-access-nk285". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.295499 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ceph" (OuterVolumeSpecName: "ceph") pod "9b0d860b-da25-46cf-abf7-17755154fc43" (UID: "9b0d860b-da25-46cf-abf7-17755154fc43"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.320926 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-inventory" (OuterVolumeSpecName: "inventory") pod "9b0d860b-da25-46cf-abf7-17755154fc43" (UID: "9b0d860b-da25-46cf-abf7-17755154fc43"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.334902 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b0d860b-da25-46cf-abf7-17755154fc43" (UID: "9b0d860b-da25-46cf-abf7-17755154fc43"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.389225 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.389260 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.389270 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b0d860b-da25-46cf-abf7-17755154fc43-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.389279 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk285\" (UniqueName: \"kubernetes.io/projected/9b0d860b-da25-46cf-abf7-17755154fc43-kube-api-access-nk285\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.729222 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" event={"ID":"9b0d860b-da25-46cf-abf7-17755154fc43","Type":"ContainerDied","Data":"892fb6d4284019e378cc78eabcec79118989dc6382755301236dcc048d758d37"} Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.729321 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="892fb6d4284019e378cc78eabcec79118989dc6382755301236dcc048d758d37" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.729402 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4n8lq" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.832948 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw"] Dec 05 01:22:15 crc kubenswrapper[4759]: E1205 01:22:15.833389 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0d860b-da25-46cf-abf7-17755154fc43" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.833408 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0d860b-da25-46cf-abf7-17755154fc43" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.833602 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b0d860b-da25-46cf-abf7-17755154fc43" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.834286 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.837630 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.837801 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.838028 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.838161 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.838282 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.852203 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw"] Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.901276 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dctd\" (UniqueName: \"kubernetes.io/projected/81fe58fd-9cda-4705-9755-2b9bb62211f7-kube-api-access-8dctd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.901679 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.901766 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:15 crc kubenswrapper[4759]: I1205 01:22:15.901814 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:16 crc kubenswrapper[4759]: I1205 01:22:16.005860 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dctd\" (UniqueName: \"kubernetes.io/projected/81fe58fd-9cda-4705-9755-2b9bb62211f7-kube-api-access-8dctd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:16 crc kubenswrapper[4759]: I1205 01:22:16.005980 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:16 crc kubenswrapper[4759]: I1205 01:22:16.006101 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:16 crc kubenswrapper[4759]: I1205 01:22:16.006186 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:16 crc kubenswrapper[4759]: I1205 01:22:16.012016 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:16 crc kubenswrapper[4759]: I1205 01:22:16.012165 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:16 crc kubenswrapper[4759]: I1205 01:22:16.012778 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:16 crc kubenswrapper[4759]: I1205 01:22:16.027745 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dctd\" (UniqueName: \"kubernetes.io/projected/81fe58fd-9cda-4705-9755-2b9bb62211f7-kube-api-access-8dctd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:16 crc kubenswrapper[4759]: I1205 01:22:16.157409 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:16 crc kubenswrapper[4759]: I1205 01:22:16.729773 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw"] Dec 05 01:22:16 crc kubenswrapper[4759]: I1205 01:22:16.741252 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" event={"ID":"81fe58fd-9cda-4705-9755-2b9bb62211f7","Type":"ContainerStarted","Data":"2f069c4d223e4107d6c7678d3e3b9dd8a7fd1f746352a1d40694d5bfb215e80e"} Dec 05 01:22:17 crc kubenswrapper[4759]: I1205 01:22:17.755484 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" event={"ID":"81fe58fd-9cda-4705-9755-2b9bb62211f7","Type":"ContainerStarted","Data":"aabe9287c33edd4fa4b8e9ea9241afcf3691b3c1035c25ed0b106d6ca3d6c0be"} Dec 05 01:22:17 crc kubenswrapper[4759]: I1205 01:22:17.773836 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" podStartSLOduration=2.355570074 podStartE2EDuration="2.773818358s" podCreationTimestamp="2025-12-05 01:22:15 +0000 UTC" firstStartedPulling="2025-12-05 01:22:16.729697896 +0000 UTC m=+3555.945358856" lastFinishedPulling="2025-12-05 01:22:17.14794617 +0000 UTC m=+3556.363607140" observedRunningTime="2025-12-05 01:22:17.771765473 +0000 UTC m=+3556.987426433" watchObservedRunningTime="2025-12-05 01:22:17.773818358 +0000 UTC m=+3556.989479308" Dec 05 01:22:30 crc kubenswrapper[4759]: I1205 01:22:30.934769 4759 generic.go:334] "Generic (PLEG): container finished" podID="81fe58fd-9cda-4705-9755-2b9bb62211f7" containerID="aabe9287c33edd4fa4b8e9ea9241afcf3691b3c1035c25ed0b106d6ca3d6c0be" exitCode=0 Dec 05 01:22:30 crc kubenswrapper[4759]: I1205 01:22:30.935560 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" event={"ID":"81fe58fd-9cda-4705-9755-2b9bb62211f7","Type":"ContainerDied","Data":"aabe9287c33edd4fa4b8e9ea9241afcf3691b3c1035c25ed0b106d6ca3d6c0be"} Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.455983 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.616512 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dctd\" (UniqueName: \"kubernetes.io/projected/81fe58fd-9cda-4705-9755-2b9bb62211f7-kube-api-access-8dctd\") pod \"81fe58fd-9cda-4705-9755-2b9bb62211f7\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.616600 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ceph\") pod \"81fe58fd-9cda-4705-9755-2b9bb62211f7\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.616734 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ssh-key\") pod \"81fe58fd-9cda-4705-9755-2b9bb62211f7\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.616768 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-inventory\") pod \"81fe58fd-9cda-4705-9755-2b9bb62211f7\" (UID: \"81fe58fd-9cda-4705-9755-2b9bb62211f7\") " Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.628544 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81fe58fd-9cda-4705-9755-2b9bb62211f7-kube-api-access-8dctd" (OuterVolumeSpecName: "kube-api-access-8dctd") pod "81fe58fd-9cda-4705-9755-2b9bb62211f7" (UID: "81fe58fd-9cda-4705-9755-2b9bb62211f7"). InnerVolumeSpecName "kube-api-access-8dctd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.642037 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ceph" (OuterVolumeSpecName: "ceph") pod "81fe58fd-9cda-4705-9755-2b9bb62211f7" (UID: "81fe58fd-9cda-4705-9755-2b9bb62211f7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.671231 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "81fe58fd-9cda-4705-9755-2b9bb62211f7" (UID: "81fe58fd-9cda-4705-9755-2b9bb62211f7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.674751 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-inventory" (OuterVolumeSpecName: "inventory") pod "81fe58fd-9cda-4705-9755-2b9bb62211f7" (UID: "81fe58fd-9cda-4705-9755-2b9bb62211f7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.719465 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.719496 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.719507 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81fe58fd-9cda-4705-9755-2b9bb62211f7-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.719528 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dctd\" (UniqueName: \"kubernetes.io/projected/81fe58fd-9cda-4705-9755-2b9bb62211f7-kube-api-access-8dctd\") on node \"crc\" DevicePath \"\"" Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.961487 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" event={"ID":"81fe58fd-9cda-4705-9755-2b9bb62211f7","Type":"ContainerDied","Data":"2f069c4d223e4107d6c7678d3e3b9dd8a7fd1f746352a1d40694d5bfb215e80e"} Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.961565 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f069c4d223e4107d6c7678d3e3b9dd8a7fd1f746352a1d40694d5bfb215e80e" Dec 05 01:22:32 crc kubenswrapper[4759]: I1205 01:22:32.961517 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.092886 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k"] Dec 05 01:22:33 crc kubenswrapper[4759]: E1205 01:22:33.093366 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fe58fd-9cda-4705-9755-2b9bb62211f7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.093388 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fe58fd-9cda-4705-9755-2b9bb62211f7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.093614 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="81fe58fd-9cda-4705-9755-2b9bb62211f7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.094390 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.097104 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.097196 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.097813 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.097845 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.097877 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.098366 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.098426 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.098531 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.098905 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.100092 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.125552 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k"] Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.158768 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.158838 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.158897 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: 
\"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.158926 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.158983 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.159017 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.159042 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.159073 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.159096 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.159132 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc 
kubenswrapper[4759]: I1205 01:22:33.159155 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.159180 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.159205 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.159237 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.159283 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.159327 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98mfd\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-kube-api-access-98mfd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.159367 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.261977 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262038 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262067 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262108 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262133 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262157 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262185 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262219 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262263 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262317 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98mfd\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-kube-api-access-98mfd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262353 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262419 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262460 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262575 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262613 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc 
kubenswrapper[4759]: I1205 01:22:33.262676 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.262709 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.269157 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.269206 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.270026 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.270432 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.271177 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.279702 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.280178 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.281164 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.283189 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.283253 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.290860 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98mfd\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-kube-api-access-98mfd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.291418 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.292082 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.292104 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.293251 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.295066 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.298549 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.411966 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.928804 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k"] Dec 05 01:22:33 crc kubenswrapper[4759]: W1205 01:22:33.935516 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod361afa29_23e5_45e4_8c9d_c7da34c4b1ac.slice/crio-ae99cea3f1b85ade31b1778a6f26017f904b7b39bc4a1d012bec159b2211b012 WatchSource:0}: Error finding container ae99cea3f1b85ade31b1778a6f26017f904b7b39bc4a1d012bec159b2211b012: Status 404 returned error can't find the container with id ae99cea3f1b85ade31b1778a6f26017f904b7b39bc4a1d012bec159b2211b012 Dec 05 01:22:33 crc kubenswrapper[4759]: I1205 01:22:33.976538 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" event={"ID":"361afa29-23e5-45e4-8c9d-c7da34c4b1ac","Type":"ContainerStarted","Data":"ae99cea3f1b85ade31b1778a6f26017f904b7b39bc4a1d012bec159b2211b012"} Dec 05 01:22:34 crc kubenswrapper[4759]: I1205 01:22:34.433938 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:22:34 crc kubenswrapper[4759]: I1205 01:22:34.434875 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:22:34 crc kubenswrapper[4759]: I1205 01:22:34.435107 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 01:22:34 crc kubenswrapper[4759]: I1205 01:22:34.436681 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 01:22:34 crc kubenswrapper[4759]: I1205 01:22:34.436979 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" gracePeriod=600 Dec 05 01:22:34 crc kubenswrapper[4759]: E1205 01:22:34.569075 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:22:35 crc kubenswrapper[4759]: I1205 01:22:35.005452 4759 generic.go:334] "Generic (PLEG): 
container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" exitCode=0 Dec 05 01:22:35 crc kubenswrapper[4759]: I1205 01:22:35.005509 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d"} Dec 05 01:22:35 crc kubenswrapper[4759]: I1205 01:22:35.005556 4759 scope.go:117] "RemoveContainer" containerID="666b9dc92fa2ce40e04f9e0757fc98a1d90be948a6233de9b66c266d66b60f32" Dec 05 01:22:35 crc kubenswrapper[4759]: I1205 01:22:35.007385 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:22:35 crc kubenswrapper[4759]: E1205 01:22:35.015246 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:22:36 crc kubenswrapper[4759]: I1205 01:22:36.027022 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" event={"ID":"361afa29-23e5-45e4-8c9d-c7da34c4b1ac","Type":"ContainerStarted","Data":"878a045465f03bdc0e7d1f12b123e9267db20a26a81d073c0a040026d87052c7"} Dec 05 01:22:36 crc kubenswrapper[4759]: I1205 01:22:36.068844 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" podStartSLOduration=2.120380716 podStartE2EDuration="3.068821769s" podCreationTimestamp="2025-12-05 01:22:33 +0000 UTC" firstStartedPulling="2025-12-05 01:22:33.93777385 +0000 UTC m=+3573.153434800" lastFinishedPulling="2025-12-05 01:22:34.886214883 +0000 UTC m=+3574.101875853" observedRunningTime="2025-12-05 01:22:36.053750467 +0000 UTC m=+3575.269411417" watchObservedRunningTime="2025-12-05 01:22:36.068821769 +0000 UTC m=+3575.284482729" Dec 05 01:22:46 crc kubenswrapper[4759]: I1205 01:22:46.162106 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:22:46 crc kubenswrapper[4759]: E1205 01:22:46.163692 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:23:00 crc kubenswrapper[4759]: I1205 01:23:00.156560 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:23:00 crc kubenswrapper[4759]: E1205 01:23:00.158684 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:23:11 crc kubenswrapper[4759]: I1205 01:23:11.178000 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:23:11 crc kubenswrapper[4759]: E1205 01:23:11.178880 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:23:25 crc kubenswrapper[4759]: I1205 01:23:25.157763 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:23:25 crc kubenswrapper[4759]: E1205 01:23:25.158599 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:23:38 crc kubenswrapper[4759]: I1205 01:23:38.157138 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:23:38 crc kubenswrapper[4759]: E1205 01:23:38.158278 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:23:43 crc kubenswrapper[4759]: E1205 01:23:43.115727 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod361afa29_23e5_45e4_8c9d_c7da34c4b1ac.slice/crio-conmon-878a045465f03bdc0e7d1f12b123e9267db20a26a81d073c0a040026d87052c7.scope\": RecentStats: unable to find data in memory cache]" Dec 05 01:23:43 crc kubenswrapper[4759]: I1205 01:23:43.868572 4759 generic.go:334] "Generic (PLEG): container finished" podID="361afa29-23e5-45e4-8c9d-c7da34c4b1ac" containerID="878a045465f03bdc0e7d1f12b123e9267db20a26a81d073c0a040026d87052c7" exitCode=0 Dec 05 01:23:43 crc kubenswrapper[4759]: I1205 01:23:43.868642 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" event={"ID":"361afa29-23e5-45e4-8c9d-c7da34c4b1ac","Type":"ContainerDied","Data":"878a045465f03bdc0e7d1f12b123e9267db20a26a81d073c0a040026d87052c7"} Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.339535 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.457371 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.457802 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-inventory\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.457822 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-combined-ca-bundle\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.457881 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-power-monitoring-combined-ca-bundle\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.457941 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-nova-combined-ca-bundle\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458000 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458033 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ceph\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458116 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ovn-combined-ca-bundle\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458198 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-ovn-default-certs-0\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: 
\"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458233 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-repo-setup-combined-ca-bundle\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458267 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-libvirt-combined-ca-bundle\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458292 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458341 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458360 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-bootstrap-combined-ca-bundle\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458378 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-neutron-metadata-combined-ca-bundle\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458398 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ssh-key\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.458431 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98mfd\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-kube-api-access-98mfd\") pod \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\" (UID: \"361afa29-23e5-45e4-8c9d-c7da34c4b1ac\") " Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.465753 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.466392 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.468678 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.469924 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.470161 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-kube-api-access-98mfd" (OuterVolumeSpecName: "kube-api-access-98mfd") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "kube-api-access-98mfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.470558 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.471151 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.472548 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.477587 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.477631 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.479561 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.480596 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.484776 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.485507 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.496940 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ceph" (OuterVolumeSpecName: "ceph") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.516511 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.522535 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-inventory" (OuterVolumeSpecName: "inventory") pod "361afa29-23e5-45e4-8c9d-c7da34c4b1ac" (UID: "361afa29-23e5-45e4-8c9d-c7da34c4b1ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561143 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561191 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561202 4759 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561235 4759 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561245 4759 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561255 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561264 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561273 4759 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561282 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-ovn-default-certs-0\") on 
node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561325 4759 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561335 4759 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561347 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561358 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561389 4759 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561399 4759 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561408 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.561416 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98mfd\" (UniqueName: \"kubernetes.io/projected/361afa29-23e5-45e4-8c9d-c7da34c4b1ac-kube-api-access-98mfd\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.899967 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" event={"ID":"361afa29-23e5-45e4-8c9d-c7da34c4b1ac","Type":"ContainerDied","Data":"ae99cea3f1b85ade31b1778a6f26017f904b7b39bc4a1d012bec159b2211b012"} Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.900015 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae99cea3f1b85ade31b1778a6f26017f904b7b39bc4a1d012bec159b2211b012" Dec 05 01:23:45 crc kubenswrapper[4759]: I1205 01:23:45.900151 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.097822 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd"] Dec 05 01:23:46 crc kubenswrapper[4759]: E1205 01:23:46.098224 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361afa29-23e5-45e4-8c9d-c7da34c4b1ac" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.098243 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="361afa29-23e5-45e4-8c9d-c7da34c4b1ac" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.098498 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="361afa29-23e5-45e4-8c9d-c7da34c4b1ac" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.099196 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.102535 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.102562 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.102593 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.103061 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.108397 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.110024 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd"] Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.173224 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.173557 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.173691 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.173730 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm5jj\" (UniqueName: \"kubernetes.io/projected/76e169bb-796f-43c4-a487-36cf0c3d13a0-kube-api-access-mm5jj\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.275519 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.275663 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.275707 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm5jj\" (UniqueName: \"kubernetes.io/projected/76e169bb-796f-43c4-a487-36cf0c3d13a0-kube-api-access-mm5jj\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.275974 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.280423 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.280869 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.287161 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 
01:23:46.301966 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm5jj\" (UniqueName: \"kubernetes.io/projected/76e169bb-796f-43c4-a487-36cf0c3d13a0-kube-api-access-mm5jj\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:46 crc kubenswrapper[4759]: I1205 01:23:46.423910 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:47 crc kubenswrapper[4759]: I1205 01:23:47.102756 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd"] Dec 05 01:23:47 crc kubenswrapper[4759]: I1205 01:23:47.924780 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" event={"ID":"76e169bb-796f-43c4-a487-36cf0c3d13a0","Type":"ContainerStarted","Data":"dc8c97997afd43fbe8fe9ff5af1bd4deff5747c428853c00e150359ff6ca82e1"} Dec 05 01:23:47 crc kubenswrapper[4759]: I1205 01:23:47.925142 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" event={"ID":"76e169bb-796f-43c4-a487-36cf0c3d13a0","Type":"ContainerStarted","Data":"5e41e901d91961045ba7c1d8e77085a4c41df0f3745a35a305de17676c7a6c78"} Dec 05 01:23:47 crc kubenswrapper[4759]: I1205 01:23:47.943964 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" podStartSLOduration=1.436095777 podStartE2EDuration="1.943941629s" podCreationTimestamp="2025-12-05 01:23:46 +0000 UTC" firstStartedPulling="2025-12-05 01:23:47.111969078 +0000 UTC m=+3646.327630028" lastFinishedPulling="2025-12-05 01:23:47.61981491 +0000 UTC m=+3646.835475880" observedRunningTime="2025-12-05 01:23:47.93745157 +0000 UTC m=+3647.153112520" watchObservedRunningTime="2025-12-05 01:23:47.943941629 +0000 UTC m=+3647.159602599" Dec 05 01:23:53 crc kubenswrapper[4759]: I1205 01:23:53.155819 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:23:53 crc kubenswrapper[4759]: E1205 01:23:53.156523 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:23:56 crc kubenswrapper[4759]: I1205 01:23:56.005955 4759 generic.go:334] "Generic (PLEG): container finished" podID="76e169bb-796f-43c4-a487-36cf0c3d13a0" containerID="dc8c97997afd43fbe8fe9ff5af1bd4deff5747c428853c00e150359ff6ca82e1" exitCode=0 Dec 05 01:23:56 crc kubenswrapper[4759]: I1205 01:23:56.006501 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" event={"ID":"76e169bb-796f-43c4-a487-36cf0c3d13a0","Type":"ContainerDied","Data":"dc8c97997afd43fbe8fe9ff5af1bd4deff5747c428853c00e150359ff6ca82e1"} Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.486942 4759 util.go:48] "No ready sandbox for pod can be found. 
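The pod_startup_latency_tracker line above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (1.943941629s), and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling, taken from the monotonic m= offsets). A quick check with the ceph-client numbers, an inference from these fields rather than a spec:

```python
# Verify the ceph-client startup figures printed by the kubelet above.
e2e = 1.943941629                        # podStartE2EDuration
pull = 3646.835475880 - 3646.327630028   # lastFinishedPulling - firstStartedPulling (m= offsets)
print(f"{e2e - pull:.9f}")               # -> 1.436095777, matching podStartSLOduration
```

The same identity holds for the ovn, redhat-operators, community-operators, and redhat-marketplace latency lines later in this excerpt.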
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.604292 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ceph\") pod \"76e169bb-796f-43c4-a487-36cf0c3d13a0\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.604517 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ssh-key\") pod \"76e169bb-796f-43c4-a487-36cf0c3d13a0\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.604691 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm5jj\" (UniqueName: \"kubernetes.io/projected/76e169bb-796f-43c4-a487-36cf0c3d13a0-kube-api-access-mm5jj\") pod \"76e169bb-796f-43c4-a487-36cf0c3d13a0\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.605515 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-inventory\") pod \"76e169bb-796f-43c4-a487-36cf0c3d13a0\" (UID: \"76e169bb-796f-43c4-a487-36cf0c3d13a0\") " Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.611090 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ceph" (OuterVolumeSpecName: "ceph") pod "76e169bb-796f-43c4-a487-36cf0c3d13a0" (UID: "76e169bb-796f-43c4-a487-36cf0c3d13a0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.611138 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e169bb-796f-43c4-a487-36cf0c3d13a0-kube-api-access-mm5jj" (OuterVolumeSpecName: "kube-api-access-mm5jj") pod "76e169bb-796f-43c4-a487-36cf0c3d13a0" (UID: "76e169bb-796f-43c4-a487-36cf0c3d13a0"). InnerVolumeSpecName "kube-api-access-mm5jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.642951 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "76e169bb-796f-43c4-a487-36cf0c3d13a0" (UID: "76e169bb-796f-43c4-a487-36cf0c3d13a0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.643550 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-inventory" (OuterVolumeSpecName: "inventory") pod "76e169bb-796f-43c4-a487-36cf0c3d13a0" (UID: "76e169bb-796f-43c4-a487-36cf0c3d13a0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.707684 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.707716 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.707727 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm5jj\" (UniqueName: \"kubernetes.io/projected/76e169bb-796f-43c4-a487-36cf0c3d13a0-kube-api-access-mm5jj\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:57 crc kubenswrapper[4759]: I1205 01:23:57.707735 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76e169bb-796f-43c4-a487-36cf0c3d13a0-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.033840 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" event={"ID":"76e169bb-796f-43c4-a487-36cf0c3d13a0","Type":"ContainerDied","Data":"5e41e901d91961045ba7c1d8e77085a4c41df0f3745a35a305de17676c7a6c78"} Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.034266 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e41e901d91961045ba7c1d8e77085a4c41df0f3745a35a305de17676c7a6c78" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.033951 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.157927 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"] Dec 05 01:23:58 crc kubenswrapper[4759]: E1205 01:23:58.159671 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e169bb-796f-43c4-a487-36cf0c3d13a0" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.159700 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e169bb-796f-43c4-a487-36cf0c3d13a0" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.160018 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e169bb-796f-43c4-a487-36cf0c3d13a0" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.161035 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.166204 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"] Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.175471 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.175641 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.175710 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.175918 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.176073 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.176567 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.320702 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.320880 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.321113 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.322452 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.322705 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f95ms\" (UniqueName: \"kubernetes.io/projected/5ec426df-120e-4f92-a1e3-3def5d61f3d3-kube-api-access-f95ms\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.425442 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.425607 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.425671 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.425769 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f95ms\" (UniqueName: \"kubernetes.io/projected/5ec426df-120e-4f92-a1e3-3def5d61f3d3-kube-api-access-f95ms\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.425901 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.426054 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.427382 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.433704 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.436909 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.437626 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.445833 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.447456 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f95ms\" (UniqueName: \"kubernetes.io/projected/5ec426df-120e-4f92-a1e3-3def5d61f3d3-kube-api-access-f95ms\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qc2t8\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:58 crc kubenswrapper[4759]: I1205 01:23:58.498781 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"
Dec 05 01:23:59 crc kubenswrapper[4759]: I1205 01:23:59.052937 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8"]
Dec 05 01:24:00 crc kubenswrapper[4759]: I1205 01:24:00.067832 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" event={"ID":"5ec426df-120e-4f92-a1e3-3def5d61f3d3","Type":"ContainerStarted","Data":"21d1fbcf755be0efdf62de066a7e9e4ff4bde7f734dbd1c90ce42c064bf46ecf"}
Dec 05 01:24:00 crc kubenswrapper[4759]: I1205 01:24:00.068131 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" event={"ID":"5ec426df-120e-4f92-a1e3-3def5d61f3d3","Type":"ContainerStarted","Data":"847f8af9410035ccec08504e47c0517fc15e83d825d96733bb8d1701c1c0c743"}
Dec 05 01:24:00 crc kubenswrapper[4759]: I1205 01:24:00.094032 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" podStartSLOduration=1.684628671 podStartE2EDuration="2.094004725s" podCreationTimestamp="2025-12-05 01:23:58 +0000 UTC" firstStartedPulling="2025-12-05 01:23:59.057568799 +0000 UTC m=+3658.273229749" lastFinishedPulling="2025-12-05 01:23:59.466944843 +0000 UTC m=+3658.682605803" observedRunningTime="2025-12-05 01:24:00.089036229 +0000 UTC m=+3659.304697179" watchObservedRunningTime="2025-12-05 01:24:00.094004725 +0000 UTC m=+3659.309665715"
Dec 05 01:24:07 crc kubenswrapper[4759]: I1205 01:24:07.156809 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d"
Dec 05 01:24:07 crc kubenswrapper[4759]: E1205 01:24:07.158504 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:24:22 crc kubenswrapper[4759]: I1205 01:24:22.156151 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d"
Dec 05 01:24:22 crc kubenswrapper[4759]: E1205 01:24:22.158143 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.612671 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fstd2"]
Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.618115 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fstd2"
Need to start a new one" pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.634624 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fstd2"] Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.728247 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-utilities\") pod \"redhat-operators-fstd2\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") " pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.729064 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt9kz\" (UniqueName: \"kubernetes.io/projected/c47baa59-321e-4c84-95c2-acb025481355-kube-api-access-dt9kz\") pod \"redhat-operators-fstd2\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") " pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.729226 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-catalog-content\") pod \"redhat-operators-fstd2\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") " pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.830726 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-catalog-content\") pod \"redhat-operators-fstd2\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") " pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.830881 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-utilities\") pod \"redhat-operators-fstd2\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") " pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.831004 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt9kz\" (UniqueName: \"kubernetes.io/projected/c47baa59-321e-4c84-95c2-acb025481355-kube-api-access-dt9kz\") pod \"redhat-operators-fstd2\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") " pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.831262 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-catalog-content\") pod \"redhat-operators-fstd2\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") " pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.831448 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-utilities\") pod \"redhat-operators-fstd2\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") " pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.849631 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dt9kz\" (UniqueName: \"kubernetes.io/projected/c47baa59-321e-4c84-95c2-acb025481355-kube-api-access-dt9kz\") pod \"redhat-operators-fstd2\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") " pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:31 crc kubenswrapper[4759]: I1205 01:24:31.951771 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:32 crc kubenswrapper[4759]: I1205 01:24:32.453443 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fstd2"] Dec 05 01:24:33 crc kubenswrapper[4759]: I1205 01:24:33.449176 4759 generic.go:334] "Generic (PLEG): container finished" podID="c47baa59-321e-4c84-95c2-acb025481355" containerID="dc08d4297df029917910e483b0d61ec922eb2f3eafa227a9ea348ee997b1e9fb" exitCode=0 Dec 05 01:24:33 crc kubenswrapper[4759]: I1205 01:24:33.449606 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fstd2" event={"ID":"c47baa59-321e-4c84-95c2-acb025481355","Type":"ContainerDied","Data":"dc08d4297df029917910e483b0d61ec922eb2f3eafa227a9ea348ee997b1e9fb"} Dec 05 01:24:33 crc kubenswrapper[4759]: I1205 01:24:33.449647 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fstd2" event={"ID":"c47baa59-321e-4c84-95c2-acb025481355","Type":"ContainerStarted","Data":"cd0a7dd3a4e0668e2a9f50529317d37daca617cc280824a9066b866193260558"} Dec 05 01:24:33 crc kubenswrapper[4759]: I1205 01:24:33.451853 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 01:24:34 crc kubenswrapper[4759]: I1205 01:24:34.462449 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fstd2" event={"ID":"c47baa59-321e-4c84-95c2-acb025481355","Type":"ContainerStarted","Data":"339d2c7c30617ff03297bb1df8260784e8a8c307e45587eed3062babe96d21e2"} Dec 05 01:24:35 crc kubenswrapper[4759]: I1205 01:24:35.478027 4759 generic.go:334] "Generic (PLEG): container finished" podID="c47baa59-321e-4c84-95c2-acb025481355" containerID="339d2c7c30617ff03297bb1df8260784e8a8c307e45587eed3062babe96d21e2" exitCode=0 Dec 05 01:24:35 crc kubenswrapper[4759]: I1205 01:24:35.478071 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fstd2" event={"ID":"c47baa59-321e-4c84-95c2-acb025481355","Type":"ContainerDied","Data":"339d2c7c30617ff03297bb1df8260784e8a8c307e45587eed3062babe96d21e2"} Dec 05 01:24:36 crc kubenswrapper[4759]: I1205 01:24:36.156245 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:24:36 crc kubenswrapper[4759]: E1205 01:24:36.156522 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:24:38 crc kubenswrapper[4759]: I1205 01:24:38.508735 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fstd2" 
event={"ID":"c47baa59-321e-4c84-95c2-acb025481355","Type":"ContainerStarted","Data":"09bc93c980a402fd628b7dcfeb0c9b9cb6f72ebd4c9431d7e346a790360950ed"} Dec 05 01:24:38 crc kubenswrapper[4759]: I1205 01:24:38.529025 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fstd2" podStartSLOduration=3.133674484 podStartE2EDuration="7.529009946s" podCreationTimestamp="2025-12-05 01:24:31 +0000 UTC" firstStartedPulling="2025-12-05 01:24:33.451625954 +0000 UTC m=+3692.667286904" lastFinishedPulling="2025-12-05 01:24:37.846961406 +0000 UTC m=+3697.062622366" observedRunningTime="2025-12-05 01:24:38.525402999 +0000 UTC m=+3697.741063939" watchObservedRunningTime="2025-12-05 01:24:38.529009946 +0000 UTC m=+3697.744670896" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.213086 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mlsgl"] Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.219437 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.278100 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mlsgl"] Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.334868 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-utilities\") pod \"community-operators-mlsgl\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.335233 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-catalog-content\") pod \"community-operators-mlsgl\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.335508 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsw6k\" (UniqueName: \"kubernetes.io/projected/30d166f9-3ac5-4fce-9771-dab6052c29be-kube-api-access-qsw6k\") pod \"community-operators-mlsgl\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.437400 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-catalog-content\") pod \"community-operators-mlsgl\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.437765 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsw6k\" (UniqueName: \"kubernetes.io/projected/30d166f9-3ac5-4fce-9771-dab6052c29be-kube-api-access-qsw6k\") pod \"community-operators-mlsgl\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.438081 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-catalog-content\") pod \"community-operators-mlsgl\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.438104 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-utilities\") pod \"community-operators-mlsgl\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.438520 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-utilities\") pod \"community-operators-mlsgl\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.459171 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsw6k\" (UniqueName: \"kubernetes.io/projected/30d166f9-3ac5-4fce-9771-dab6052c29be-kube-api-access-qsw6k\") pod \"community-operators-mlsgl\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.557821 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.952247 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:41 crc kubenswrapper[4759]: I1205 01:24:41.952816 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:42 crc kubenswrapper[4759]: I1205 01:24:42.146155 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mlsgl"] Dec 05 01:24:42 crc kubenswrapper[4759]: I1205 01:24:42.553708 4759 generic.go:334] "Generic (PLEG): container finished" podID="30d166f9-3ac5-4fce-9771-dab6052c29be" containerID="0fe8d5cb1fb30f145cf584e5c95bf99a9dd086aa37205c88c4137b295701e51b" exitCode=0 Dec 05 01:24:42 crc kubenswrapper[4759]: I1205 01:24:42.553747 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlsgl" event={"ID":"30d166f9-3ac5-4fce-9771-dab6052c29be","Type":"ContainerDied","Data":"0fe8d5cb1fb30f145cf584e5c95bf99a9dd086aa37205c88c4137b295701e51b"} Dec 05 01:24:42 crc kubenswrapper[4759]: I1205 01:24:42.553772 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlsgl" event={"ID":"30d166f9-3ac5-4fce-9771-dab6052c29be","Type":"ContainerStarted","Data":"1579021495d4daa1b4abc52e9fd42235936bc3dfa92f834566e5c7ecfffb41fc"} Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.002170 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fstd2" podUID="c47baa59-321e-4c84-95c2-acb025481355" containerName="registry-server" probeResult="failure" output=< Dec 05 01:24:43 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 01:24:43 crc kubenswrapper[4759]: > Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.609079 4759 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wbf5k"] Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.612053 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.622747 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbf5k"] Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.684366 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-utilities\") pod \"redhat-marketplace-wbf5k\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") " pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.684659 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lvjn\" (UniqueName: \"kubernetes.io/projected/15535f7b-aab0-40f7-b862-c7c5f901b1f5-kube-api-access-6lvjn\") pod \"redhat-marketplace-wbf5k\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") " pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.684884 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-catalog-content\") pod \"redhat-marketplace-wbf5k\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") " pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.786620 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-catalog-content\") pod \"redhat-marketplace-wbf5k\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") " pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.786795 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-utilities\") pod \"redhat-marketplace-wbf5k\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") " pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.786893 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lvjn\" (UniqueName: \"kubernetes.io/projected/15535f7b-aab0-40f7-b862-c7c5f901b1f5-kube-api-access-6lvjn\") pod \"redhat-marketplace-wbf5k\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") " pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.787190 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-catalog-content\") pod \"redhat-marketplace-wbf5k\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") " pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.787547 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-utilities\") pod \"redhat-marketplace-wbf5k\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") 
" pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.813398 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lvjn\" (UniqueName: \"kubernetes.io/projected/15535f7b-aab0-40f7-b862-c7c5f901b1f5-kube-api-access-6lvjn\") pod \"redhat-marketplace-wbf5k\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") " pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:43 crc kubenswrapper[4759]: I1205 01:24:43.928442 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:44 crc kubenswrapper[4759]: I1205 01:24:44.465470 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbf5k"] Dec 05 01:24:44 crc kubenswrapper[4759]: W1205 01:24:44.478421 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15535f7b_aab0_40f7_b862_c7c5f901b1f5.slice/crio-57e808bfbce8769f5d3ae68b8668814c572da15fdd3b0df4b183bc357440f6a2 WatchSource:0}: Error finding container 57e808bfbce8769f5d3ae68b8668814c572da15fdd3b0df4b183bc357440f6a2: Status 404 returned error can't find the container with id 57e808bfbce8769f5d3ae68b8668814c572da15fdd3b0df4b183bc357440f6a2 Dec 05 01:24:44 crc kubenswrapper[4759]: I1205 01:24:44.635003 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbf5k" event={"ID":"15535f7b-aab0-40f7-b862-c7c5f901b1f5","Type":"ContainerStarted","Data":"57e808bfbce8769f5d3ae68b8668814c572da15fdd3b0df4b183bc357440f6a2"} Dec 05 01:24:44 crc kubenswrapper[4759]: I1205 01:24:44.674610 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlsgl" event={"ID":"30d166f9-3ac5-4fce-9771-dab6052c29be","Type":"ContainerStarted","Data":"71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc"} Dec 05 01:24:44 crc kubenswrapper[4759]: E1205 01:24:44.917826 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30d166f9_3ac5_4fce_9771_dab6052c29be.slice/crio-conmon-71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc.scope\": RecentStats: unable to find data in memory cache]" Dec 05 01:24:45 crc kubenswrapper[4759]: I1205 01:24:45.688181 4759 generic.go:334] "Generic (PLEG): container finished" podID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" containerID="0abe178f0b0efc629ec769d4c2c9852748b579a67e8ee763f9156def17144009" exitCode=0 Dec 05 01:24:45 crc kubenswrapper[4759]: I1205 01:24:45.688378 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbf5k" event={"ID":"15535f7b-aab0-40f7-b862-c7c5f901b1f5","Type":"ContainerDied","Data":"0abe178f0b0efc629ec769d4c2c9852748b579a67e8ee763f9156def17144009"} Dec 05 01:24:45 crc kubenswrapper[4759]: I1205 01:24:45.694052 4759 generic.go:334] "Generic (PLEG): container finished" podID="30d166f9-3ac5-4fce-9771-dab6052c29be" containerID="71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc" exitCode=0 Dec 05 01:24:45 crc kubenswrapper[4759]: I1205 01:24:45.694097 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlsgl" 
event={"ID":"30d166f9-3ac5-4fce-9771-dab6052c29be","Type":"ContainerDied","Data":"71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc"} Dec 05 01:24:46 crc kubenswrapper[4759]: I1205 01:24:46.705055 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbf5k" event={"ID":"15535f7b-aab0-40f7-b862-c7c5f901b1f5","Type":"ContainerStarted","Data":"6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a"} Dec 05 01:24:46 crc kubenswrapper[4759]: I1205 01:24:46.713208 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlsgl" event={"ID":"30d166f9-3ac5-4fce-9771-dab6052c29be","Type":"ContainerStarted","Data":"2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a"} Dec 05 01:24:46 crc kubenswrapper[4759]: I1205 01:24:46.752266 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mlsgl" podStartSLOduration=2.219918685 podStartE2EDuration="5.752250762s" podCreationTimestamp="2025-12-05 01:24:41 +0000 UTC" firstStartedPulling="2025-12-05 01:24:42.555655085 +0000 UTC m=+3701.771316035" lastFinishedPulling="2025-12-05 01:24:46.087987152 +0000 UTC m=+3705.303648112" observedRunningTime="2025-12-05 01:24:46.747436779 +0000 UTC m=+3705.963097729" watchObservedRunningTime="2025-12-05 01:24:46.752250762 +0000 UTC m=+3705.967911712" Dec 05 01:24:47 crc kubenswrapper[4759]: I1205 01:24:47.727704 4759 generic.go:334] "Generic (PLEG): container finished" podID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" containerID="6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a" exitCode=0 Dec 05 01:24:47 crc kubenswrapper[4759]: I1205 01:24:47.729587 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbf5k" event={"ID":"15535f7b-aab0-40f7-b862-c7c5f901b1f5","Type":"ContainerDied","Data":"6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a"} Dec 05 01:24:48 crc kubenswrapper[4759]: I1205 01:24:48.742622 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbf5k" event={"ID":"15535f7b-aab0-40f7-b862-c7c5f901b1f5","Type":"ContainerStarted","Data":"9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c"} Dec 05 01:24:48 crc kubenswrapper[4759]: I1205 01:24:48.770206 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wbf5k" podStartSLOduration=3.333316773 podStartE2EDuration="5.77018855s" podCreationTimestamp="2025-12-05 01:24:43 +0000 UTC" firstStartedPulling="2025-12-05 01:24:45.690287628 +0000 UTC m=+3704.905948588" lastFinishedPulling="2025-12-05 01:24:48.127159415 +0000 UTC m=+3707.342820365" observedRunningTime="2025-12-05 01:24:48.768398121 +0000 UTC m=+3707.984059111" watchObservedRunningTime="2025-12-05 01:24:48.77018855 +0000 UTC m=+3707.985849500" Dec 05 01:24:51 crc kubenswrapper[4759]: I1205 01:24:51.165968 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:24:51 crc kubenswrapper[4759]: E1205 01:24:51.166845 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:24:51 crc kubenswrapper[4759]: I1205 01:24:51.559068 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:51 crc kubenswrapper[4759]: I1205 01:24:51.559126 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:51 crc kubenswrapper[4759]: I1205 01:24:51.624176 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:51 crc kubenswrapper[4759]: I1205 01:24:51.879545 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:52 crc kubenswrapper[4759]: I1205 01:24:52.010990 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:52 crc kubenswrapper[4759]: I1205 01:24:52.063578 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:53 crc kubenswrapper[4759]: I1205 01:24:53.806272 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mlsgl"] Dec 05 01:24:53 crc kubenswrapper[4759]: I1205 01:24:53.806868 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mlsgl" podUID="30d166f9-3ac5-4fce-9771-dab6052c29be" containerName="registry-server" containerID="cri-o://2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a" gracePeriod=2 Dec 05 01:24:53 crc kubenswrapper[4759]: I1205 01:24:53.929069 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:53 crc kubenswrapper[4759]: I1205 01:24:53.929401 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:53 crc kubenswrapper[4759]: I1205 01:24:53.992225 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.443117 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fstd2"] Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.444216 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fstd2" podUID="c47baa59-321e-4c84-95c2-acb025481355" containerName="registry-server" containerID="cri-o://09bc93c980a402fd628b7dcfeb0c9b9cb6f72ebd4c9431d7e346a790360950ed" gracePeriod=2 Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.460696 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.606137 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-utilities\") pod \"30d166f9-3ac5-4fce-9771-dab6052c29be\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.606408 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsw6k\" (UniqueName: \"kubernetes.io/projected/30d166f9-3ac5-4fce-9771-dab6052c29be-kube-api-access-qsw6k\") pod \"30d166f9-3ac5-4fce-9771-dab6052c29be\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.606471 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-catalog-content\") pod \"30d166f9-3ac5-4fce-9771-dab6052c29be\" (UID: \"30d166f9-3ac5-4fce-9771-dab6052c29be\") " Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.606736 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-utilities" (OuterVolumeSpecName: "utilities") pod "30d166f9-3ac5-4fce-9771-dab6052c29be" (UID: "30d166f9-3ac5-4fce-9771-dab6052c29be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.607042 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.614076 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d166f9-3ac5-4fce-9771-dab6052c29be-kube-api-access-qsw6k" (OuterVolumeSpecName: "kube-api-access-qsw6k") pod "30d166f9-3ac5-4fce-9771-dab6052c29be" (UID: "30d166f9-3ac5-4fce-9771-dab6052c29be"). InnerVolumeSpecName "kube-api-access-qsw6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.665369 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30d166f9-3ac5-4fce-9771-dab6052c29be" (UID: "30d166f9-3ac5-4fce-9771-dab6052c29be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.708617 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsw6k\" (UniqueName: \"kubernetes.io/projected/30d166f9-3ac5-4fce-9771-dab6052c29be-kube-api-access-qsw6k\") on node \"crc\" DevicePath \"\"" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.708820 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d166f9-3ac5-4fce-9771-dab6052c29be-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.827239 4759 generic.go:334] "Generic (PLEG): container finished" podID="c47baa59-321e-4c84-95c2-acb025481355" containerID="09bc93c980a402fd628b7dcfeb0c9b9cb6f72ebd4c9431d7e346a790360950ed" exitCode=0 Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.827319 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fstd2" event={"ID":"c47baa59-321e-4c84-95c2-acb025481355","Type":"ContainerDied","Data":"09bc93c980a402fd628b7dcfeb0c9b9cb6f72ebd4c9431d7e346a790360950ed"} Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.835097 4759 generic.go:334] "Generic (PLEG): container finished" podID="30d166f9-3ac5-4fce-9771-dab6052c29be" containerID="2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a" exitCode=0 Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.835574 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlsgl" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.835585 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlsgl" event={"ID":"30d166f9-3ac5-4fce-9771-dab6052c29be","Type":"ContainerDied","Data":"2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a"} Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.835715 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlsgl" event={"ID":"30d166f9-3ac5-4fce-9771-dab6052c29be","Type":"ContainerDied","Data":"1579021495d4daa1b4abc52e9fd42235936bc3dfa92f834566e5c7ecfffb41fc"} Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.835775 4759 scope.go:117] "RemoveContainer" containerID="2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.880884 4759 scope.go:117] "RemoveContainer" containerID="71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.886947 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mlsgl"] Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.895357 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fstd2" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.906457 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wbf5k" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.912374 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mlsgl"] Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.922172 4759 scope.go:117] "RemoveContainer" containerID="0fe8d5cb1fb30f145cf584e5c95bf99a9dd086aa37205c88c4137b295701e51b" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.975881 4759 scope.go:117] "RemoveContainer" containerID="2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a" Dec 05 01:24:54 crc kubenswrapper[4759]: E1205 01:24:54.976503 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a\": container with ID starting with 2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a not found: ID does not exist" containerID="2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.976563 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a"} err="failed to get container status \"2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a\": rpc error: code = NotFound desc = could not find container \"2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a\": container with ID starting with 2649d1e209161a7c04eaef16feee3ceea444294ed9b06821381e16e30fe3b59a not found: ID does not exist" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.976599 4759 scope.go:117] "RemoveContainer" containerID="71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc" Dec 05 01:24:54 crc kubenswrapper[4759]: E1205 01:24:54.977010 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc\": container with ID starting with 71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc not found: ID does not exist" containerID="71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.977071 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc"} err="failed to get container status \"71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc\": rpc error: code = NotFound desc = could not find container \"71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc\": container with ID starting with 71323fe3aabf35f6388b2528a6a2adbe00e1ae36673e7d24faf7160a243dd5bc not found: ID does not exist" Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.977105 4759 scope.go:117] "RemoveContainer" containerID="0fe8d5cb1fb30f145cf584e5c95bf99a9dd086aa37205c88c4137b295701e51b" Dec 05 01:24:54 crc kubenswrapper[4759]: E1205 01:24:54.977613 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fe8d5cb1fb30f145cf584e5c95bf99a9dd086aa37205c88c4137b295701e51b\": container with ID 
Dec 05 01:24:54 crc kubenswrapper[4759]: I1205 01:24:54.977674 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fe8d5cb1fb30f145cf584e5c95bf99a9dd086aa37205c88c4137b295701e51b"} err="failed to get container status \"0fe8d5cb1fb30f145cf584e5c95bf99a9dd086aa37205c88c4137b295701e51b\": rpc error: code = NotFound desc = could not find container \"0fe8d5cb1fb30f145cf584e5c95bf99a9dd086aa37205c88c4137b295701e51b\": container with ID starting with 0fe8d5cb1fb30f145cf584e5c95bf99a9dd086aa37205c88c4137b295701e51b not found: ID does not exist"
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.015532 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt9kz\" (UniqueName: \"kubernetes.io/projected/c47baa59-321e-4c84-95c2-acb025481355-kube-api-access-dt9kz\") pod \"c47baa59-321e-4c84-95c2-acb025481355\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") "
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.015604 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-utilities\") pod \"c47baa59-321e-4c84-95c2-acb025481355\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") "
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.015682 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-catalog-content\") pod \"c47baa59-321e-4c84-95c2-acb025481355\" (UID: \"c47baa59-321e-4c84-95c2-acb025481355\") "
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.017581 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-utilities" (OuterVolumeSpecName: "utilities") pod "c47baa59-321e-4c84-95c2-acb025481355" (UID: "c47baa59-321e-4c84-95c2-acb025481355"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.020822 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c47baa59-321e-4c84-95c2-acb025481355-kube-api-access-dt9kz" (OuterVolumeSpecName: "kube-api-access-dt9kz") pod "c47baa59-321e-4c84-95c2-acb025481355" (UID: "c47baa59-321e-4c84-95c2-acb025481355"). InnerVolumeSpecName "kube-api-access-dt9kz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.117605 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt9kz\" (UniqueName: \"kubernetes.io/projected/c47baa59-321e-4c84-95c2-acb025481355-kube-api-access-dt9kz\") on node \"crc\" DevicePath \"\""
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.117633 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.142748 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c47baa59-321e-4c84-95c2-acb025481355" (UID: "c47baa59-321e-4c84-95c2-acb025481355"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.167419 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d166f9-3ac5-4fce-9771-dab6052c29be" path="/var/lib/kubelet/pods/30d166f9-3ac5-4fce-9771-dab6052c29be/volumes"
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.226679 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c47baa59-321e-4c84-95c2-acb025481355-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 01:24:55 crc kubenswrapper[4759]: E1205 01:24:55.255538 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc47baa59_321e_4c84_95c2_acb025481355.slice/crio-cd0a7dd3a4e0668e2a9f50529317d37daca617cc280824a9066b866193260558\": RecentStats: unable to find data in memory cache]"
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.849247 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fstd2" event={"ID":"c47baa59-321e-4c84-95c2-acb025481355","Type":"ContainerDied","Data":"cd0a7dd3a4e0668e2a9f50529317d37daca617cc280824a9066b866193260558"}
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.849677 4759 scope.go:117] "RemoveContainer" containerID="09bc93c980a402fd628b7dcfeb0c9b9cb6f72ebd4c9431d7e346a790360950ed"
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.849300 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fstd2"
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.880525 4759 scope.go:117] "RemoveContainer" containerID="339d2c7c30617ff03297bb1df8260784e8a8c307e45587eed3062babe96d21e2"
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.882093 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fstd2"]
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.893500 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fstd2"]
Dec 05 01:24:55 crc kubenswrapper[4759]: I1205 01:24:55.919703 4759 scope.go:117] "RemoveContainer" containerID="dc08d4297df029917910e483b0d61ec922eb2f3eafa227a9ea348ee997b1e9fb"
Dec 05 01:24:56 crc kubenswrapper[4759]: I1205 01:24:56.822272 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbf5k"]
Dec 05 01:24:57 crc kubenswrapper[4759]: I1205 01:24:57.169754 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c47baa59-321e-4c84-95c2-acb025481355" path="/var/lib/kubelet/pods/c47baa59-321e-4c84-95c2-acb025481355/volumes"
Dec 05 01:24:57 crc kubenswrapper[4759]: I1205 01:24:57.874482 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wbf5k" podUID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" containerName="registry-server" containerID="cri-o://9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c" gracePeriod=2
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.346142 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbf5k"
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.508204 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-utilities\") pod \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") "
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.508513 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lvjn\" (UniqueName: \"kubernetes.io/projected/15535f7b-aab0-40f7-b862-c7c5f901b1f5-kube-api-access-6lvjn\") pod \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") "
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.508545 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-catalog-content\") pod \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\" (UID: \"15535f7b-aab0-40f7-b862-c7c5f901b1f5\") "
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.509485 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-utilities" (OuterVolumeSpecName: "utilities") pod "15535f7b-aab0-40f7-b862-c7c5f901b1f5" (UID: "15535f7b-aab0-40f7-b862-c7c5f901b1f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.515272 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15535f7b-aab0-40f7-b862-c7c5f901b1f5-kube-api-access-6lvjn" (OuterVolumeSpecName: "kube-api-access-6lvjn") pod "15535f7b-aab0-40f7-b862-c7c5f901b1f5" (UID: "15535f7b-aab0-40f7-b862-c7c5f901b1f5"). InnerVolumeSpecName "kube-api-access-6lvjn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.529073 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15535f7b-aab0-40f7-b862-c7c5f901b1f5" (UID: "15535f7b-aab0-40f7-b862-c7c5f901b1f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.613328 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lvjn\" (UniqueName: \"kubernetes.io/projected/15535f7b-aab0-40f7-b862-c7c5f901b1f5-kube-api-access-6lvjn\") on node \"crc\" DevicePath \"\""
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.613694 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.613710 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15535f7b-aab0-40f7-b862-c7c5f901b1f5-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.888717 4759 generic.go:334] "Generic (PLEG): container finished" podID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" containerID="9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c" exitCode=0
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.888760 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbf5k" event={"ID":"15535f7b-aab0-40f7-b862-c7c5f901b1f5","Type":"ContainerDied","Data":"9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c"}
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.888794 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbf5k" event={"ID":"15535f7b-aab0-40f7-b862-c7c5f901b1f5","Type":"ContainerDied","Data":"57e808bfbce8769f5d3ae68b8668814c572da15fdd3b0df4b183bc357440f6a2"}
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.888811 4759 scope.go:117] "RemoveContainer" containerID="9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c"
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.888821 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbf5k"
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.916899 4759 scope.go:117] "RemoveContainer" containerID="6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a"
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.965941 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbf5k"]
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.973846 4759 scope.go:117] "RemoveContainer" containerID="0abe178f0b0efc629ec769d4c2c9852748b579a67e8ee763f9156def17144009"
Dec 05 01:24:58 crc kubenswrapper[4759]: I1205 01:24:58.986063 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbf5k"]
Dec 05 01:24:59 crc kubenswrapper[4759]: I1205 01:24:59.027516 4759 scope.go:117] "RemoveContainer" containerID="9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c"
Dec 05 01:24:59 crc kubenswrapper[4759]: E1205 01:24:59.029196 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c\": container with ID starting with 9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c not found: ID does not exist" containerID="9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c"
Dec 05 01:24:59 crc kubenswrapper[4759]: I1205 01:24:59.029537 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c"} err="failed to get container status \"9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c\": rpc error: code = NotFound desc = could not find container \"9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c\": container with ID starting with 9f4df5c607fa8b5559dae10e74c525baca8bfa36a15d3a91b468c52cf404658c not found: ID does not exist"
Dec 05 01:24:59 crc kubenswrapper[4759]: I1205 01:24:59.029570 4759 scope.go:117] "RemoveContainer" containerID="6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a"
Dec 05 01:24:59 crc kubenswrapper[4759]: E1205 01:24:59.029907 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a\": container with ID starting with 6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a not found: ID does not exist" containerID="6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a"
Dec 05 01:24:59 crc kubenswrapper[4759]: I1205 01:24:59.029935 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a"} err="failed to get container status \"6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a\": rpc error: code = NotFound desc = could not find container \"6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a\": container with ID starting with 6a00ae66e08ad7cdcf65710c1c31b4d7e3c45b6a0ccb6b5dade3e3813bfdb22a not found: ID does not exist"
Dec 05 01:24:59 crc kubenswrapper[4759]: I1205 01:24:59.030082 4759 scope.go:117] "RemoveContainer" containerID="0abe178f0b0efc629ec769d4c2c9852748b579a67e8ee763f9156def17144009"
Dec 05 01:24:59 crc kubenswrapper[4759]: E1205 01:24:59.034179 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0abe178f0b0efc629ec769d4c2c9852748b579a67e8ee763f9156def17144009\": container with ID starting with 0abe178f0b0efc629ec769d4c2c9852748b579a67e8ee763f9156def17144009 not found: ID does not exist" containerID="0abe178f0b0efc629ec769d4c2c9852748b579a67e8ee763f9156def17144009"
Dec 05 01:24:59 crc kubenswrapper[4759]: I1205 01:24:59.034235 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abe178f0b0efc629ec769d4c2c9852748b579a67e8ee763f9156def17144009"} err="failed to get container status \"0abe178f0b0efc629ec769d4c2c9852748b579a67e8ee763f9156def17144009\": rpc error: code = NotFound desc = could not find container \"0abe178f0b0efc629ec769d4c2c9852748b579a67e8ee763f9156def17144009\": container with ID starting with 0abe178f0b0efc629ec769d4c2c9852748b579a67e8ee763f9156def17144009 not found: ID does not exist"
Dec 05 01:24:59 crc kubenswrapper[4759]: I1205 01:24:59.167250 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" path="/var/lib/kubelet/pods/15535f7b-aab0-40f7-b862-c7c5f901b1f5/volumes"
Dec 05 01:25:02 crc kubenswrapper[4759]: I1205 01:25:02.156261 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d"
Dec 05 01:25:02 crc kubenswrapper[4759]: E1205 01:25:02.156984 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:25:14 crc kubenswrapper[4759]: I1205 01:25:14.156365 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d"
Dec 05 01:25:14 crc kubenswrapper[4759]: E1205 01:25:14.157383 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:25:26 crc kubenswrapper[4759]: I1205 01:25:26.156752 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d"
Dec 05 01:25:26 crc kubenswrapper[4759]: E1205 01:25:26.157662 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:25:27 crc kubenswrapper[4759]: I1205 01:25:27.217191 4759 generic.go:334] "Generic (PLEG): container finished" podID="5ec426df-120e-4f92-a1e3-3def5d61f3d3" containerID="21d1fbcf755be0efdf62de066a7e9e4ff4bde7f734dbd1c90ce42c064bf46ecf" exitCode=0
Dec 05 01:25:27 crc kubenswrapper[4759]: I1205 01:25:27.217287 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" event={"ID":"5ec426df-120e-4f92-a1e3-3def5d61f3d3","Type":"ContainerDied","Data":"21d1fbcf755be0efdf62de066a7e9e4ff4bde7f734dbd1c90ce42c064bf46ecf"}
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" event={"ID":"5ec426df-120e-4f92-a1e3-3def5d61f3d3","Type":"ContainerDied","Data":"21d1fbcf755be0efdf62de066a7e9e4ff4bde7f734dbd1c90ce42c064bf46ecf"} Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.709920 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.741724 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ovncontroller-config-0\") pod \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.741943 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f95ms\" (UniqueName: \"kubernetes.io/projected/5ec426df-120e-4f92-a1e3-3def5d61f3d3-kube-api-access-f95ms\") pod \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.742137 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-inventory\") pod \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.742384 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ssh-key\") pod \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.742664 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ovn-combined-ca-bundle\") pod \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.742790 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ceph\") pod \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\" (UID: \"5ec426df-120e-4f92-a1e3-3def5d61f3d3\") " Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.755138 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ceph" (OuterVolumeSpecName: "ceph") pod "5ec426df-120e-4f92-a1e3-3def5d61f3d3" (UID: "5ec426df-120e-4f92-a1e3-3def5d61f3d3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.756596 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5ec426df-120e-4f92-a1e3-3def5d61f3d3" (UID: "5ec426df-120e-4f92-a1e3-3def5d61f3d3"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.762338 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec426df-120e-4f92-a1e3-3def5d61f3d3-kube-api-access-f95ms" (OuterVolumeSpecName: "kube-api-access-f95ms") pod "5ec426df-120e-4f92-a1e3-3def5d61f3d3" (UID: "5ec426df-120e-4f92-a1e3-3def5d61f3d3"). InnerVolumeSpecName "kube-api-access-f95ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.792375 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "5ec426df-120e-4f92-a1e3-3def5d61f3d3" (UID: "5ec426df-120e-4f92-a1e3-3def5d61f3d3"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.799268 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5ec426df-120e-4f92-a1e3-3def5d61f3d3" (UID: "5ec426df-120e-4f92-a1e3-3def5d61f3d3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.802644 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-inventory" (OuterVolumeSpecName: "inventory") pod "5ec426df-120e-4f92-a1e3-3def5d61f3d3" (UID: "5ec426df-120e-4f92-a1e3-3def5d61f3d3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.847166 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f95ms\" (UniqueName: \"kubernetes.io/projected/5ec426df-120e-4f92-a1e3-3def5d61f3d3-kube-api-access-f95ms\") on node \"crc\" DevicePath \"\"" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.847411 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.847526 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.847609 4759 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.847681 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:25:28 crc kubenswrapper[4759]: I1205 01:25:28.847750 4759 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ec426df-120e-4f92-a1e3-3def5d61f3d3-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.243001 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" event={"ID":"5ec426df-120e-4f92-a1e3-3def5d61f3d3","Type":"ContainerDied","Data":"847f8af9410035ccec08504e47c0517fc15e83d825d96733bb8d1701c1c0c743"} Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.243054 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="847f8af9410035ccec08504e47c0517fc15e83d825d96733bb8d1701c1c0c743" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.243329 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qc2t8" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.395598 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2"] Dec 05 01:25:29 crc kubenswrapper[4759]: E1205 01:25:29.396243 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d166f9-3ac5-4fce-9771-dab6052c29be" containerName="extract-utilities" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.396273 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d166f9-3ac5-4fce-9771-dab6052c29be" containerName="extract-utilities" Dec 05 01:25:29 crc kubenswrapper[4759]: E1205 01:25:29.396294 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec426df-120e-4f92-a1e3-3def5d61f3d3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.396354 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec426df-120e-4f92-a1e3-3def5d61f3d3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 01:25:29 crc kubenswrapper[4759]: E1205 01:25:29.396380 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" containerName="extract-content" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.396397 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" containerName="extract-content" Dec 05 01:25:29 crc kubenswrapper[4759]: E1205 01:25:29.396427 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d166f9-3ac5-4fce-9771-dab6052c29be" containerName="extract-content" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.396444 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d166f9-3ac5-4fce-9771-dab6052c29be" containerName="extract-content" Dec 05 01:25:29 crc kubenswrapper[4759]: E1205 01:25:29.396476 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" containerName="registry-server" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.396491 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" containerName="registry-server" Dec 05 01:25:29 crc kubenswrapper[4759]: E1205 01:25:29.396542 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47baa59-321e-4c84-95c2-acb025481355" containerName="extract-utilities" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.396555 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47baa59-321e-4c84-95c2-acb025481355" containerName="extract-utilities" Dec 05 01:25:29 crc kubenswrapper[4759]: E1205 01:25:29.396592 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47baa59-321e-4c84-95c2-acb025481355" containerName="registry-server" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.396605 4759 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c47baa59-321e-4c84-95c2-acb025481355" containerName="registry-server" Dec 05 01:25:29 crc kubenswrapper[4759]: E1205 01:25:29.396629 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" containerName="extract-utilities" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.396642 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" containerName="extract-utilities" Dec 05 01:25:29 crc kubenswrapper[4759]: E1205 01:25:29.396668 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d166f9-3ac5-4fce-9771-dab6052c29be" containerName="registry-server" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.396680 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d166f9-3ac5-4fce-9771-dab6052c29be" containerName="registry-server" Dec 05 01:25:29 crc kubenswrapper[4759]: E1205 01:25:29.396713 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47baa59-321e-4c84-95c2-acb025481355" containerName="extract-content" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.396725 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47baa59-321e-4c84-95c2-acb025481355" containerName="extract-content" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.397121 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="15535f7b-aab0-40f7-b862-c7c5f901b1f5" containerName="registry-server" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.397166 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec426df-120e-4f92-a1e3-3def5d61f3d3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.397194 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d166f9-3ac5-4fce-9771-dab6052c29be" containerName="registry-server" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.397221 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c47baa59-321e-4c84-95c2-acb025481355" containerName="registry-server" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.398826 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.401228 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.401586 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.402302 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.402585 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.402893 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.403106 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.404055 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.406164 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2"] Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.460435 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.460565 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.460771 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.460870 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.461021 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.461137 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjm5\" (UniqueName: \"kubernetes.io/projected/dad31a51-d010-4c0a-b52f-022acdb7d893-kube-api-access-pdjm5\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.461533 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.563956 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.563998 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.564024 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.564050 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.564115 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: 
\"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.564145 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdjm5\" (UniqueName: \"kubernetes.io/projected/dad31a51-d010-4c0a-b52f-022acdb7d893-kube-api-access-pdjm5\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.564187 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.568396 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.570619 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.570834 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.571182 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.578899 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.581512 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdjm5\" (UniqueName: \"kubernetes.io/projected/dad31a51-d010-4c0a-b52f-022acdb7d893-kube-api-access-pdjm5\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.581553 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:29 crc kubenswrapper[4759]: I1205 01:25:29.732442 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:25:30 crc kubenswrapper[4759]: I1205 01:25:30.328108 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2"] Dec 05 01:25:31 crc kubenswrapper[4759]: I1205 01:25:31.270068 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" event={"ID":"dad31a51-d010-4c0a-b52f-022acdb7d893","Type":"ContainerStarted","Data":"bf1bbdf4651e7e86ec1ddc33544a99be772788d12ec8cec837a388d6782c58f5"} Dec 05 01:25:31 crc kubenswrapper[4759]: I1205 01:25:31.270400 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" event={"ID":"dad31a51-d010-4c0a-b52f-022acdb7d893","Type":"ContainerStarted","Data":"f7a2ce65dab38e0c46827ee80a5c65ce380200bd624b8fad3a2ef33d37a84750"} Dec 05 01:25:31 crc kubenswrapper[4759]: I1205 01:25:31.303117 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" podStartSLOduration=1.839341443 podStartE2EDuration="2.303087011s" podCreationTimestamp="2025-12-05 01:25:29 +0000 UTC" firstStartedPulling="2025-12-05 01:25:30.332446822 +0000 UTC m=+3749.548107772" lastFinishedPulling="2025-12-05 01:25:30.79619239 +0000 UTC m=+3750.011853340" observedRunningTime="2025-12-05 01:25:31.299533345 +0000 UTC m=+3750.515194325" watchObservedRunningTime="2025-12-05 01:25:31.303087011 +0000 UTC m=+3750.518747991" Dec 05 01:25:39 crc kubenswrapper[4759]: I1205 01:25:39.156425 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:25:39 crc kubenswrapper[4759]: E1205 01:25:39.157658 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:25:52 crc kubenswrapper[4759]: I1205 01:25:52.156589 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:25:52 crc kubenswrapper[4759]: E1205 01:25:52.157655 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:26:06 crc kubenswrapper[4759]: I1205 01:26:06.156911 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:26:06 crc kubenswrapper[4759]: E1205 01:26:06.158561 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:26:20 crc kubenswrapper[4759]: I1205 01:26:20.156244 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:26:20 crc kubenswrapper[4759]: E1205 01:26:20.158147 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:26:31 crc kubenswrapper[4759]: I1205 01:26:31.165125 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:26:31 crc kubenswrapper[4759]: E1205 01:26:31.166188 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:26:43 crc kubenswrapper[4759]: I1205 01:26:43.156685 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:26:43 crc kubenswrapper[4759]: E1205 01:26:43.157826 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:26:54 crc kubenswrapper[4759]: I1205 01:26:54.157439 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:26:54 crc kubenswrapper[4759]: E1205 01:26:54.158580 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" 
podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:26:55 crc kubenswrapper[4759]: I1205 01:26:55.369809 4759 generic.go:334] "Generic (PLEG): container finished" podID="dad31a51-d010-4c0a-b52f-022acdb7d893" containerID="bf1bbdf4651e7e86ec1ddc33544a99be772788d12ec8cec837a388d6782c58f5" exitCode=0 Dec 05 01:26:55 crc kubenswrapper[4759]: I1205 01:26:55.369913 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" event={"ID":"dad31a51-d010-4c0a-b52f-022acdb7d893","Type":"ContainerDied","Data":"bf1bbdf4651e7e86ec1ddc33544a99be772788d12ec8cec837a388d6782c58f5"} Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.845536 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.904518 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-metadata-combined-ca-bundle\") pod \"dad31a51-d010-4c0a-b52f-022acdb7d893\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.904944 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-ovn-metadata-agent-neutron-config-0\") pod \"dad31a51-d010-4c0a-b52f-022acdb7d893\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.904993 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-nova-metadata-neutron-config-0\") pod \"dad31a51-d010-4c0a-b52f-022acdb7d893\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.905067 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdjm5\" (UniqueName: \"kubernetes.io/projected/dad31a51-d010-4c0a-b52f-022acdb7d893-kube-api-access-pdjm5\") pod \"dad31a51-d010-4c0a-b52f-022acdb7d893\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.905162 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ssh-key\") pod \"dad31a51-d010-4c0a-b52f-022acdb7d893\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.905249 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ceph\") pod \"dad31a51-d010-4c0a-b52f-022acdb7d893\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.905340 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-inventory\") pod \"dad31a51-d010-4c0a-b52f-022acdb7d893\" (UID: \"dad31a51-d010-4c0a-b52f-022acdb7d893\") " Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.912161 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "dad31a51-d010-4c0a-b52f-022acdb7d893" (UID: "dad31a51-d010-4c0a-b52f-022acdb7d893"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.913660 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ceph" (OuterVolumeSpecName: "ceph") pod "dad31a51-d010-4c0a-b52f-022acdb7d893" (UID: "dad31a51-d010-4c0a-b52f-022acdb7d893"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.915819 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dad31a51-d010-4c0a-b52f-022acdb7d893-kube-api-access-pdjm5" (OuterVolumeSpecName: "kube-api-access-pdjm5") pod "dad31a51-d010-4c0a-b52f-022acdb7d893" (UID: "dad31a51-d010-4c0a-b52f-022acdb7d893"). InnerVolumeSpecName "kube-api-access-pdjm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.944419 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "dad31a51-d010-4c0a-b52f-022acdb7d893" (UID: "dad31a51-d010-4c0a-b52f-022acdb7d893"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.947133 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "dad31a51-d010-4c0a-b52f-022acdb7d893" (UID: "dad31a51-d010-4c0a-b52f-022acdb7d893"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.950659 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-inventory" (OuterVolumeSpecName: "inventory") pod "dad31a51-d010-4c0a-b52f-022acdb7d893" (UID: "dad31a51-d010-4c0a-b52f-022acdb7d893"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:26:56 crc kubenswrapper[4759]: I1205 01:26:56.953720 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dad31a51-d010-4c0a-b52f-022acdb7d893" (UID: "dad31a51-d010-4c0a-b52f-022acdb7d893"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.008624 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.008671 4759 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.008689 4759 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.008706 4759 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.008723 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdjm5\" (UniqueName: \"kubernetes.io/projected/dad31a51-d010-4c0a-b52f-022acdb7d893-kube-api-access-pdjm5\") on node \"crc\" DevicePath \"\"" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.008735 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.008749 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dad31a51-d010-4c0a-b52f-022acdb7d893-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.390637 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" event={"ID":"dad31a51-d010-4c0a-b52f-022acdb7d893","Type":"ContainerDied","Data":"f7a2ce65dab38e0c46827ee80a5c65ce380200bd624b8fad3a2ef33d37a84750"} Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.390691 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7a2ce65dab38e0c46827ee80a5c65ce380200bd624b8fad3a2ef33d37a84750" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.390721 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.526785 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q"] Dec 05 01:26:57 crc kubenswrapper[4759]: E1205 01:26:57.527345 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad31a51-d010-4c0a-b52f-022acdb7d893" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.527363 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad31a51-d010-4c0a-b52f-022acdb7d893" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.527573 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="dad31a51-d010-4c0a-b52f-022acdb7d893" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.528295 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.530587 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.530942 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.531109 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.531453 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.532017 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.532538 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.535533 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q"] Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.619547 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.620081 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.620158 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.620233 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.620263 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75lc5\" (UniqueName: \"kubernetes.io/projected/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-kube-api-access-75lc5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.620508 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.723041 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.723214 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.723446 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.723571 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.723624 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75lc5\" (UniqueName: 
\"kubernetes.io/projected/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-kube-api-access-75lc5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.723728 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.727783 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.728069 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.729865 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.730875 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.740195 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.748785 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75lc5\" (UniqueName: \"kubernetes.io/projected/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-kube-api-access-75lc5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:57 crc kubenswrapper[4759]: I1205 01:26:57.851457 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:26:58 crc kubenswrapper[4759]: I1205 01:26:58.503667 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q"] Dec 05 01:26:59 crc kubenswrapper[4759]: I1205 01:26:59.415007 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" event={"ID":"b68949f6-0ba5-476a-8ff1-9b4247fe99e8","Type":"ContainerStarted","Data":"2c179a60b44fc2099f9b18f5d601c6c229b0ab541b73079f4bfc2b04488c3059"} Dec 05 01:26:59 crc kubenswrapper[4759]: I1205 01:26:59.415281 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" event={"ID":"b68949f6-0ba5-476a-8ff1-9b4247fe99e8","Type":"ContainerStarted","Data":"969437fef001ea04605262d33046e464e125ba34ddeb008338ab9b7641b05ee5"} Dec 05 01:26:59 crc kubenswrapper[4759]: I1205 01:26:59.434404 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" podStartSLOduration=1.957152652 podStartE2EDuration="2.434371621s" podCreationTimestamp="2025-12-05 01:26:57 +0000 UTC" firstStartedPulling="2025-12-05 01:26:58.522515927 +0000 UTC m=+3837.738176877" lastFinishedPulling="2025-12-05 01:26:58.999734896 +0000 UTC m=+3838.215395846" observedRunningTime="2025-12-05 01:26:59.432145077 +0000 UTC m=+3838.647806057" watchObservedRunningTime="2025-12-05 01:26:59.434371621 +0000 UTC m=+3838.650032571" Dec 05 01:27:05 crc kubenswrapper[4759]: I1205 01:27:05.156563 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:27:05 crc kubenswrapper[4759]: E1205 01:27:05.157546 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:27:16 crc kubenswrapper[4759]: I1205 01:27:16.156615 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:27:16 crc kubenswrapper[4759]: E1205 01:27:16.157647 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:27:28 crc kubenswrapper[4759]: I1205 01:27:28.155749 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:27:28 crc kubenswrapper[4759]: E1205 01:27:28.156672 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:27:42 crc kubenswrapper[4759]: I1205 01:27:42.156684 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:27:43 crc kubenswrapper[4759]: I1205 01:27:43.963338 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"fb49e48850188d79d26e2db4e66ad4fe74e212b94a7f5c6f7c1f65063010de3b"} Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.674623 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cfkhs"] Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.684076 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.697874 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cfkhs"] Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.771047 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-catalog-content\") pod \"certified-operators-cfkhs\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.771151 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq929\" (UniqueName: \"kubernetes.io/projected/13cd0844-fbc0-4bd5-8093-f245527a41a5-kube-api-access-tq929\") pod \"certified-operators-cfkhs\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.771345 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-utilities\") pod \"certified-operators-cfkhs\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.872766 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-catalog-content\") pod \"certified-operators-cfkhs\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.872872 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq929\" (UniqueName: \"kubernetes.io/projected/13cd0844-fbc0-4bd5-8093-f245527a41a5-kube-api-access-tq929\") pod \"certified-operators-cfkhs\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.873019 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-utilities\") pod \"certified-operators-cfkhs\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " 
pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.873360 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-catalog-content\") pod \"certified-operators-cfkhs\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.873523 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-utilities\") pod \"certified-operators-cfkhs\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:39 crc kubenswrapper[4759]: I1205 01:28:39.898878 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq929\" (UniqueName: \"kubernetes.io/projected/13cd0844-fbc0-4bd5-8093-f245527a41a5-kube-api-access-tq929\") pod \"certified-operators-cfkhs\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:40 crc kubenswrapper[4759]: I1205 01:28:40.018517 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:40 crc kubenswrapper[4759]: I1205 01:28:40.587603 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cfkhs"] Dec 05 01:28:40 crc kubenswrapper[4759]: I1205 01:28:40.626402 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfkhs" event={"ID":"13cd0844-fbc0-4bd5-8093-f245527a41a5","Type":"ContainerStarted","Data":"2aa4de394c14a024fa8e9f6ef3f29bcf8e0f5221606c4e491095433b49913de9"} Dec 05 01:28:41 crc kubenswrapper[4759]: I1205 01:28:41.659062 4759 generic.go:334] "Generic (PLEG): container finished" podID="13cd0844-fbc0-4bd5-8093-f245527a41a5" containerID="d0c32efcbc3a3c04eb030addef72d3549163f00474927debe56b4df80220a4fc" exitCode=0 Dec 05 01:28:41 crc kubenswrapper[4759]: I1205 01:28:41.659130 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfkhs" event={"ID":"13cd0844-fbc0-4bd5-8093-f245527a41a5","Type":"ContainerDied","Data":"d0c32efcbc3a3c04eb030addef72d3549163f00474927debe56b4df80220a4fc"} Dec 05 01:28:42 crc kubenswrapper[4759]: I1205 01:28:42.672882 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfkhs" event={"ID":"13cd0844-fbc0-4bd5-8093-f245527a41a5","Type":"ContainerStarted","Data":"a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd"} Dec 05 01:28:44 crc kubenswrapper[4759]: I1205 01:28:44.697334 4759 generic.go:334] "Generic (PLEG): container finished" podID="13cd0844-fbc0-4bd5-8093-f245527a41a5" containerID="a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd" exitCode=0 Dec 05 01:28:44 crc kubenswrapper[4759]: I1205 01:28:44.697668 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfkhs" event={"ID":"13cd0844-fbc0-4bd5-8093-f245527a41a5","Type":"ContainerDied","Data":"a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd"} Dec 05 01:28:45 crc kubenswrapper[4759]: I1205 01:28:45.722674 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-cfkhs" event={"ID":"13cd0844-fbc0-4bd5-8093-f245527a41a5","Type":"ContainerStarted","Data":"a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289"} Dec 05 01:28:45 crc kubenswrapper[4759]: I1205 01:28:45.756746 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cfkhs" podStartSLOduration=3.32702644 podStartE2EDuration="6.756722261s" podCreationTimestamp="2025-12-05 01:28:39 +0000 UTC" firstStartedPulling="2025-12-05 01:28:41.668024957 +0000 UTC m=+3940.883685947" lastFinishedPulling="2025-12-05 01:28:45.097720768 +0000 UTC m=+3944.313381768" observedRunningTime="2025-12-05 01:28:45.7408951 +0000 UTC m=+3944.956556070" watchObservedRunningTime="2025-12-05 01:28:45.756722261 +0000 UTC m=+3944.972383211" Dec 05 01:28:50 crc kubenswrapper[4759]: I1205 01:28:50.018712 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:50 crc kubenswrapper[4759]: I1205 01:28:50.019404 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:50 crc kubenswrapper[4759]: I1205 01:28:50.083142 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:50 crc kubenswrapper[4759]: I1205 01:28:50.866871 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:50 crc kubenswrapper[4759]: I1205 01:28:50.923655 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cfkhs"] Dec 05 01:28:52 crc kubenswrapper[4759]: I1205 01:28:52.820199 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cfkhs" podUID="13cd0844-fbc0-4bd5-8093-f245527a41a5" containerName="registry-server" containerID="cri-o://a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289" gracePeriod=2 Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.429227 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.608718 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq929\" (UniqueName: \"kubernetes.io/projected/13cd0844-fbc0-4bd5-8093-f245527a41a5-kube-api-access-tq929\") pod \"13cd0844-fbc0-4bd5-8093-f245527a41a5\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.608942 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-utilities\") pod \"13cd0844-fbc0-4bd5-8093-f245527a41a5\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.609005 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-catalog-content\") pod \"13cd0844-fbc0-4bd5-8093-f245527a41a5\" (UID: \"13cd0844-fbc0-4bd5-8093-f245527a41a5\") " Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.610959 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-utilities" (OuterVolumeSpecName: "utilities") pod "13cd0844-fbc0-4bd5-8093-f245527a41a5" (UID: "13cd0844-fbc0-4bd5-8093-f245527a41a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.617152 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13cd0844-fbc0-4bd5-8093-f245527a41a5-kube-api-access-tq929" (OuterVolumeSpecName: "kube-api-access-tq929") pod "13cd0844-fbc0-4bd5-8093-f245527a41a5" (UID: "13cd0844-fbc0-4bd5-8093-f245527a41a5"). InnerVolumeSpecName "kube-api-access-tq929". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.669750 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13cd0844-fbc0-4bd5-8093-f245527a41a5" (UID: "13cd0844-fbc0-4bd5-8093-f245527a41a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.711089 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq929\" (UniqueName: \"kubernetes.io/projected/13cd0844-fbc0-4bd5-8093-f245527a41a5-kube-api-access-tq929\") on node \"crc\" DevicePath \"\"" Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.711123 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.711133 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13cd0844-fbc0-4bd5-8093-f245527a41a5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.834409 4759 generic.go:334] "Generic (PLEG): container finished" podID="13cd0844-fbc0-4bd5-8093-f245527a41a5" containerID="a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289" exitCode=0 Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.834473 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfkhs" event={"ID":"13cd0844-fbc0-4bd5-8093-f245527a41a5","Type":"ContainerDied","Data":"a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289"} Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.834540 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfkhs" event={"ID":"13cd0844-fbc0-4bd5-8093-f245527a41a5","Type":"ContainerDied","Data":"2aa4de394c14a024fa8e9f6ef3f29bcf8e0f5221606c4e491095433b49913de9"} Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.834565 4759 scope.go:117] "RemoveContainer" containerID="a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289" Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.834501 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cfkhs" Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.877680 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cfkhs"] Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.881645 4759 scope.go:117] "RemoveContainer" containerID="a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd" Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.889458 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cfkhs"] Dec 05 01:28:53 crc kubenswrapper[4759]: I1205 01:28:53.904806 4759 scope.go:117] "RemoveContainer" containerID="d0c32efcbc3a3c04eb030addef72d3549163f00474927debe56b4df80220a4fc" Dec 05 01:28:54 crc kubenswrapper[4759]: I1205 01:28:54.007684 4759 scope.go:117] "RemoveContainer" containerID="a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289" Dec 05 01:28:54 crc kubenswrapper[4759]: E1205 01:28:54.008069 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289\": container with ID starting with a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289 not found: ID does not exist" containerID="a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289" Dec 05 01:28:54 crc kubenswrapper[4759]: I1205 01:28:54.008127 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289"} err="failed to get container status \"a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289\": rpc error: code = NotFound desc = could not find container \"a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289\": container with ID starting with a4f7ed5b4013bf1a62e1b8dde2f8afbca20be4f5391264310c2099c5f2f59289 not found: ID does not exist" Dec 05 01:28:54 crc kubenswrapper[4759]: I1205 01:28:54.008156 4759 scope.go:117] "RemoveContainer" containerID="a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd" Dec 05 01:28:54 crc kubenswrapper[4759]: E1205 01:28:54.008620 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd\": container with ID starting with a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd not found: ID does not exist" containerID="a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd" Dec 05 01:28:54 crc kubenswrapper[4759]: I1205 01:28:54.008649 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd"} err="failed to get container status \"a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd\": rpc error: code = NotFound desc = could not find container \"a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd\": container with ID starting with a698d46d8d5921e977f6db9d2704203ec9c482f184cc68797a234e216b00e4bd not found: ID does not exist" Dec 05 01:28:54 crc kubenswrapper[4759]: I1205 01:28:54.008682 4759 scope.go:117] "RemoveContainer" containerID="d0c32efcbc3a3c04eb030addef72d3549163f00474927debe56b4df80220a4fc" Dec 05 01:28:54 crc kubenswrapper[4759]: E1205 01:28:54.009220 4759 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d0c32efcbc3a3c04eb030addef72d3549163f00474927debe56b4df80220a4fc\": container with ID starting with d0c32efcbc3a3c04eb030addef72d3549163f00474927debe56b4df80220a4fc not found: ID does not exist" containerID="d0c32efcbc3a3c04eb030addef72d3549163f00474927debe56b4df80220a4fc" Dec 05 01:28:54 crc kubenswrapper[4759]: I1205 01:28:54.009259 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0c32efcbc3a3c04eb030addef72d3549163f00474927debe56b4df80220a4fc"} err="failed to get container status \"d0c32efcbc3a3c04eb030addef72d3549163f00474927debe56b4df80220a4fc\": rpc error: code = NotFound desc = could not find container \"d0c32efcbc3a3c04eb030addef72d3549163f00474927debe56b4df80220a4fc\": container with ID starting with d0c32efcbc3a3c04eb030addef72d3549163f00474927debe56b4df80220a4fc not found: ID does not exist" Dec 05 01:28:55 crc kubenswrapper[4759]: I1205 01:28:55.188029 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13cd0844-fbc0-4bd5-8093-f245527a41a5" path="/var/lib/kubelet/pods/13cd0844-fbc0-4bd5-8093-f245527a41a5/volumes" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.175701 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx"] Dec 05 01:30:00 crc kubenswrapper[4759]: E1205 01:30:00.176800 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13cd0844-fbc0-4bd5-8093-f245527a41a5" containerName="registry-server" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.176817 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cd0844-fbc0-4bd5-8093-f245527a41a5" containerName="registry-server" Dec 05 01:30:00 crc kubenswrapper[4759]: E1205 01:30:00.176854 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13cd0844-fbc0-4bd5-8093-f245527a41a5" containerName="extract-content" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.176865 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cd0844-fbc0-4bd5-8093-f245527a41a5" containerName="extract-content" Dec 05 01:30:00 crc kubenswrapper[4759]: E1205 01:30:00.176947 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13cd0844-fbc0-4bd5-8093-f245527a41a5" containerName="extract-utilities" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.176958 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cd0844-fbc0-4bd5-8093-f245527a41a5" containerName="extract-utilities" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.177244 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="13cd0844-fbc0-4bd5-8093-f245527a41a5" containerName="registry-server" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.178162 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.182839 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.190213 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx"] Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.192208 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.213051 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b1b96e7-6a9d-4297-8bb8-788206857735-config-volume\") pod \"collect-profiles-29414970-wnjpx\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.213109 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b1b96e7-6a9d-4297-8bb8-788206857735-secret-volume\") pod \"collect-profiles-29414970-wnjpx\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.213194 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk7mn\" (UniqueName: \"kubernetes.io/projected/1b1b96e7-6a9d-4297-8bb8-788206857735-kube-api-access-bk7mn\") pod \"collect-profiles-29414970-wnjpx\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.315698 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b1b96e7-6a9d-4297-8bb8-788206857735-config-volume\") pod \"collect-profiles-29414970-wnjpx\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.315756 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b1b96e7-6a9d-4297-8bb8-788206857735-secret-volume\") pod \"collect-profiles-29414970-wnjpx\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.315814 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk7mn\" (UniqueName: \"kubernetes.io/projected/1b1b96e7-6a9d-4297-8bb8-788206857735-kube-api-access-bk7mn\") pod \"collect-profiles-29414970-wnjpx\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.317284 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b1b96e7-6a9d-4297-8bb8-788206857735-config-volume\") pod 
\"collect-profiles-29414970-wnjpx\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.330103 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b1b96e7-6a9d-4297-8bb8-788206857735-secret-volume\") pod \"collect-profiles-29414970-wnjpx\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.332498 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk7mn\" (UniqueName: \"kubernetes.io/projected/1b1b96e7-6a9d-4297-8bb8-788206857735-kube-api-access-bk7mn\") pod \"collect-profiles-29414970-wnjpx\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:00 crc kubenswrapper[4759]: I1205 01:30:00.508967 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:01 crc kubenswrapper[4759]: I1205 01:30:01.048486 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx"] Dec 05 01:30:01 crc kubenswrapper[4759]: I1205 01:30:01.659781 4759 generic.go:334] "Generic (PLEG): container finished" podID="1b1b96e7-6a9d-4297-8bb8-788206857735" containerID="346ad81de72ebcd144e403f017220963b13430557eecdf2bc3468ec12c853ec1" exitCode=0 Dec 05 01:30:01 crc kubenswrapper[4759]: I1205 01:30:01.659853 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" event={"ID":"1b1b96e7-6a9d-4297-8bb8-788206857735","Type":"ContainerDied","Data":"346ad81de72ebcd144e403f017220963b13430557eecdf2bc3468ec12c853ec1"} Dec 05 01:30:01 crc kubenswrapper[4759]: I1205 01:30:01.660842 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" event={"ID":"1b1b96e7-6a9d-4297-8bb8-788206857735","Type":"ContainerStarted","Data":"03a4fc47665eefcf752de0ee2eba3a93c82fe95599e3e4b7f6bef6f1c978d408"} Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.171019 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.280876 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b1b96e7-6a9d-4297-8bb8-788206857735-config-volume\") pod \"1b1b96e7-6a9d-4297-8bb8-788206857735\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.281076 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b1b96e7-6a9d-4297-8bb8-788206857735-secret-volume\") pod \"1b1b96e7-6a9d-4297-8bb8-788206857735\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.281226 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk7mn\" (UniqueName: \"kubernetes.io/projected/1b1b96e7-6a9d-4297-8bb8-788206857735-kube-api-access-bk7mn\") pod \"1b1b96e7-6a9d-4297-8bb8-788206857735\" (UID: \"1b1b96e7-6a9d-4297-8bb8-788206857735\") " Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.281658 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b1b96e7-6a9d-4297-8bb8-788206857735-config-volume" (OuterVolumeSpecName: "config-volume") pod "1b1b96e7-6a9d-4297-8bb8-788206857735" (UID: "1b1b96e7-6a9d-4297-8bb8-788206857735"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.293685 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1b96e7-6a9d-4297-8bb8-788206857735-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1b1b96e7-6a9d-4297-8bb8-788206857735" (UID: "1b1b96e7-6a9d-4297-8bb8-788206857735"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.293717 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1b96e7-6a9d-4297-8bb8-788206857735-kube-api-access-bk7mn" (OuterVolumeSpecName: "kube-api-access-bk7mn") pod "1b1b96e7-6a9d-4297-8bb8-788206857735" (UID: "1b1b96e7-6a9d-4297-8bb8-788206857735"). InnerVolumeSpecName "kube-api-access-bk7mn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.383280 4759 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b1b96e7-6a9d-4297-8bb8-788206857735-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.383329 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk7mn\" (UniqueName: \"kubernetes.io/projected/1b1b96e7-6a9d-4297-8bb8-788206857735-kube-api-access-bk7mn\") on node \"crc\" DevicePath \"\"" Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.383339 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b1b96e7-6a9d-4297-8bb8-788206857735-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.679929 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" event={"ID":"1b1b96e7-6a9d-4297-8bb8-788206857735","Type":"ContainerDied","Data":"03a4fc47665eefcf752de0ee2eba3a93c82fe95599e3e4b7f6bef6f1c978d408"} Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.679972 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx" Dec 05 01:30:03 crc kubenswrapper[4759]: I1205 01:30:03.679973 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03a4fc47665eefcf752de0ee2eba3a93c82fe95599e3e4b7f6bef6f1c978d408" Dec 05 01:30:04 crc kubenswrapper[4759]: I1205 01:30:04.259033 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl"] Dec 05 01:30:04 crc kubenswrapper[4759]: I1205 01:30:04.268954 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414925-qqvbl"] Dec 05 01:30:04 crc kubenswrapper[4759]: I1205 01:30:04.446287 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:30:04 crc kubenswrapper[4759]: I1205 01:30:04.446378 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:30:05 crc kubenswrapper[4759]: I1205 01:30:05.219857 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="294c868c-f73e-4ef5-b9ac-359751888700" path="/var/lib/kubelet/pods/294c868c-f73e-4ef5-b9ac-359751888700/volumes" Dec 05 01:30:31 crc kubenswrapper[4759]: I1205 01:30:31.612396 4759 scope.go:117] "RemoveContainer" containerID="c8d12ff814c7fe7114b6519df703ddefdb3289797ec33d4d7596113349a4d821" Dec 05 01:30:34 crc kubenswrapper[4759]: I1205 01:30:34.432871 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 05 01:30:34 crc kubenswrapper[4759]: I1205 01:30:34.433477 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:30:44 crc kubenswrapper[4759]: I1205 01:30:44.109918 4759 generic.go:334] "Generic (PLEG): container finished" podID="b68949f6-0ba5-476a-8ff1-9b4247fe99e8" containerID="2c179a60b44fc2099f9b18f5d601c6c229b0ab541b73079f4bfc2b04488c3059" exitCode=0 Dec 05 01:30:44 crc kubenswrapper[4759]: I1205 01:30:44.110023 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" event={"ID":"b68949f6-0ba5-476a-8ff1-9b4247fe99e8","Type":"ContainerDied","Data":"2c179a60b44fc2099f9b18f5d601c6c229b0ab541b73079f4bfc2b04488c3059"} Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.651561 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.740923 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75lc5\" (UniqueName: \"kubernetes.io/projected/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-kube-api-access-75lc5\") pod \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.740993 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ceph\") pod \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.741162 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-inventory\") pod \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.741253 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-secret-0\") pod \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.741323 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ssh-key\") pod \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.741397 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-combined-ca-bundle\") pod \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\" (UID: \"b68949f6-0ba5-476a-8ff1-9b4247fe99e8\") " Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.747938 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-combined-ca-bundle" 
(OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b68949f6-0ba5-476a-8ff1-9b4247fe99e8" (UID: "b68949f6-0ba5-476a-8ff1-9b4247fe99e8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.748286 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ceph" (OuterVolumeSpecName: "ceph") pod "b68949f6-0ba5-476a-8ff1-9b4247fe99e8" (UID: "b68949f6-0ba5-476a-8ff1-9b4247fe99e8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.748406 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-kube-api-access-75lc5" (OuterVolumeSpecName: "kube-api-access-75lc5") pod "b68949f6-0ba5-476a-8ff1-9b4247fe99e8" (UID: "b68949f6-0ba5-476a-8ff1-9b4247fe99e8"). InnerVolumeSpecName "kube-api-access-75lc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.774171 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b68949f6-0ba5-476a-8ff1-9b4247fe99e8" (UID: "b68949f6-0ba5-476a-8ff1-9b4247fe99e8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.779946 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-inventory" (OuterVolumeSpecName: "inventory") pod "b68949f6-0ba5-476a-8ff1-9b4247fe99e8" (UID: "b68949f6-0ba5-476a-8ff1-9b4247fe99e8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.782649 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b68949f6-0ba5-476a-8ff1-9b4247fe99e8" (UID: "b68949f6-0ba5-476a-8ff1-9b4247fe99e8"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.846805 4759 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.846864 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.846881 4759 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.846893 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75lc5\" (UniqueName: \"kubernetes.io/projected/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-kube-api-access-75lc5\") on node \"crc\" DevicePath \"\"" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.846902 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:30:45 crc kubenswrapper[4759]: I1205 01:30:45.846915 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b68949f6-0ba5-476a-8ff1-9b4247fe99e8-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.160377 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" event={"ID":"b68949f6-0ba5-476a-8ff1-9b4247fe99e8","Type":"ContainerDied","Data":"969437fef001ea04605262d33046e464e125ba34ddeb008338ab9b7641b05ee5"} Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.160414 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="969437fef001ea04605262d33046e464e125ba34ddeb008338ab9b7641b05ee5" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.160442 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.300396 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j"] Dec 05 01:30:46 crc kubenswrapper[4759]: E1205 01:30:46.300955 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68949f6-0ba5-476a-8ff1-9b4247fe99e8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.300982 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68949f6-0ba5-476a-8ff1-9b4247fe99e8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 01:30:46 crc kubenswrapper[4759]: E1205 01:30:46.301022 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1b96e7-6a9d-4297-8bb8-788206857735" containerName="collect-profiles" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.301032 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1b96e7-6a9d-4297-8bb8-788206857735" containerName="collect-profiles" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.301286 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="b68949f6-0ba5-476a-8ff1-9b4247fe99e8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.301353 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b1b96e7-6a9d-4297-8bb8-788206857735" containerName="collect-profiles" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.302212 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j"] Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.302329 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.307888 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.307888 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.308379 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.308400 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.308436 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.308670 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.308721 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.308870 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.309021 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.471384 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.471775 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.471840 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7wzm\" (UniqueName: \"kubernetes.io/projected/3fac4138-c163-4a29-b1b0-b78285e908ec-kube-api-access-k7wzm\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.471949 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 
01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.471979 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.472122 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.472266 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.472356 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.472401 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.472522 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.472627 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.574278 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.574407 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.574490 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.574563 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.574670 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.574754 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.574836 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7wzm\" (UniqueName: \"kubernetes.io/projected/3fac4138-c163-4a29-b1b0-b78285e908ec-kube-api-access-k7wzm\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.574994 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.575045 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.575115 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.575226 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.576000 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.576707 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.579377 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.579729 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.580728 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.580833 4759 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.580922 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.581098 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.581349 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.582894 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.595373 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7wzm\" (UniqueName: \"kubernetes.io/projected/3fac4138-c163-4a29-b1b0-b78285e908ec-kube-api-access-k7wzm\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:46 crc kubenswrapper[4759]: I1205 01:30:46.629916 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:30:47 crc kubenswrapper[4759]: I1205 01:30:47.275060 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j"] Dec 05 01:30:47 crc kubenswrapper[4759]: W1205 01:30:47.279154 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fac4138_c163_4a29_b1b0_b78285e908ec.slice/crio-9cd433a1c3e6d713dfdda56f89c5e5cba04e0132df4f2dc6e7ac33dc3c6e76d1 WatchSource:0}: Error finding container 9cd433a1c3e6d713dfdda56f89c5e5cba04e0132df4f2dc6e7ac33dc3c6e76d1: Status 404 returned error can't find the container with id 9cd433a1c3e6d713dfdda56f89c5e5cba04e0132df4f2dc6e7ac33dc3c6e76d1 Dec 05 01:30:47 crc kubenswrapper[4759]: I1205 01:30:47.282090 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 01:30:48 crc kubenswrapper[4759]: I1205 01:30:48.194826 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" event={"ID":"3fac4138-c163-4a29-b1b0-b78285e908ec","Type":"ContainerStarted","Data":"2ed6f5d5410c4af0efe59ee839142b25c699041fc5adbf3bc37fb5975773c60c"} Dec 05 01:30:48 crc kubenswrapper[4759]: I1205 01:30:48.195345 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" event={"ID":"3fac4138-c163-4a29-b1b0-b78285e908ec","Type":"ContainerStarted","Data":"9cd433a1c3e6d713dfdda56f89c5e5cba04e0132df4f2dc6e7ac33dc3c6e76d1"} Dec 05 01:30:48 crc kubenswrapper[4759]: I1205 01:30:48.218829 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" podStartSLOduration=1.705209713 podStartE2EDuration="2.218804909s" podCreationTimestamp="2025-12-05 01:30:46 +0000 UTC" firstStartedPulling="2025-12-05 01:30:47.281893047 +0000 UTC m=+4066.497554007" lastFinishedPulling="2025-12-05 01:30:47.795488253 +0000 UTC m=+4067.011149203" observedRunningTime="2025-12-05 01:30:48.217226121 +0000 UTC m=+4067.432887091" watchObservedRunningTime="2025-12-05 01:30:48.218804909 +0000 UTC m=+4067.434465879" Dec 05 01:31:04 crc kubenswrapper[4759]: I1205 01:31:04.433040 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:31:04 crc kubenswrapper[4759]: I1205 01:31:04.433781 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:31:04 crc kubenswrapper[4759]: I1205 01:31:04.433842 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 01:31:04 crc kubenswrapper[4759]: I1205 01:31:04.434840 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"fb49e48850188d79d26e2db4e66ad4fe74e212b94a7f5c6f7c1f65063010de3b"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 01:31:04 crc kubenswrapper[4759]: I1205 01:31:04.434910 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://fb49e48850188d79d26e2db4e66ad4fe74e212b94a7f5c6f7c1f65063010de3b" gracePeriod=600 Dec 05 01:31:05 crc kubenswrapper[4759]: I1205 01:31:05.415051 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="fb49e48850188d79d26e2db4e66ad4fe74e212b94a7f5c6f7c1f65063010de3b" exitCode=0 Dec 05 01:31:05 crc kubenswrapper[4759]: I1205 01:31:05.415140 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"fb49e48850188d79d26e2db4e66ad4fe74e212b94a7f5c6f7c1f65063010de3b"} Dec 05 01:31:05 crc kubenswrapper[4759]: I1205 01:31:05.415564 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc"} Dec 05 01:31:05 crc kubenswrapper[4759]: I1205 01:31:05.415597 4759 scope.go:117] "RemoveContainer" containerID="5947bddad25c854ce710ee57d1cfc9c4647cf3be0ea03c0661446b3f6d8dde5d" Dec 05 01:33:04 crc kubenswrapper[4759]: I1205 01:33:04.433790 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:33:04 crc kubenswrapper[4759]: I1205 01:33:04.434582 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:33:34 crc kubenswrapper[4759]: I1205 01:33:34.433062 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:33:34 crc kubenswrapper[4759]: I1205 01:33:34.433736 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:34:04 crc kubenswrapper[4759]: I1205 01:34:04.433570 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:34:04 crc kubenswrapper[4759]: I1205 01:34:04.434354 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:34:04 crc kubenswrapper[4759]: I1205 01:34:04.434427 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 01:34:04 crc kubenswrapper[4759]: I1205 01:34:04.435563 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 01:34:04 crc kubenswrapper[4759]: I1205 01:34:04.435654 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" gracePeriod=600 Dec 05 01:34:04 crc kubenswrapper[4759]: E1205 01:34:04.560738 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:34:05 crc kubenswrapper[4759]: I1205 01:34:05.472736 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" exitCode=0 Dec 05 01:34:05 crc kubenswrapper[4759]: I1205 01:34:05.472792 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc"} Dec 05 01:34:05 crc kubenswrapper[4759]: I1205 01:34:05.473161 4759 scope.go:117] "RemoveContainer" containerID="fb49e48850188d79d26e2db4e66ad4fe74e212b94a7f5c6f7c1f65063010de3b" Dec 05 01:34:05 crc kubenswrapper[4759]: I1205 01:34:05.473993 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:34:05 crc kubenswrapper[4759]: E1205 01:34:05.474359 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:34:19 crc kubenswrapper[4759]: I1205 01:34:19.159491 4759 scope.go:117] "RemoveContainer" 
containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:34:19 crc kubenswrapper[4759]: E1205 01:34:19.162843 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:34:30 crc kubenswrapper[4759]: I1205 01:34:30.155480 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:34:30 crc kubenswrapper[4759]: E1205 01:34:30.156262 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:34:45 crc kubenswrapper[4759]: I1205 01:34:45.156749 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:34:45 crc kubenswrapper[4759]: E1205 01:34:45.197455 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:34:57 crc kubenswrapper[4759]: I1205 01:34:57.156110 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:34:57 crc kubenswrapper[4759]: E1205 01:34:57.156996 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:34:57 crc kubenswrapper[4759]: I1205 01:34:57.895553 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pkkjj"] Dec 05 01:34:57 crc kubenswrapper[4759]: I1205 01:34:57.899200 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:34:57 crc kubenswrapper[4759]: I1205 01:34:57.916440 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkkjj"] Dec 05 01:34:58 crc kubenswrapper[4759]: I1205 01:34:58.032464 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-utilities\") pod \"redhat-marketplace-pkkjj\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") " pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:34:58 crc kubenswrapper[4759]: I1205 01:34:58.032527 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-catalog-content\") pod \"redhat-marketplace-pkkjj\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") " pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:34:58 crc kubenswrapper[4759]: I1205 01:34:58.032593 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs4t7\" (UniqueName: \"kubernetes.io/projected/4b299381-4904-4e5f-a678-87d92eb16279-kube-api-access-fs4t7\") pod \"redhat-marketplace-pkkjj\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") " pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:34:58 crc kubenswrapper[4759]: I1205 01:34:58.134001 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-utilities\") pod \"redhat-marketplace-pkkjj\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") " pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:34:58 crc kubenswrapper[4759]: I1205 01:34:58.134277 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-catalog-content\") pod \"redhat-marketplace-pkkjj\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") " pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:34:58 crc kubenswrapper[4759]: I1205 01:34:58.134332 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs4t7\" (UniqueName: \"kubernetes.io/projected/4b299381-4904-4e5f-a678-87d92eb16279-kube-api-access-fs4t7\") pod \"redhat-marketplace-pkkjj\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") " pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:34:58 crc kubenswrapper[4759]: I1205 01:34:58.134806 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-utilities\") pod \"redhat-marketplace-pkkjj\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") " pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:34:58 crc kubenswrapper[4759]: I1205 01:34:58.134894 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-catalog-content\") pod \"redhat-marketplace-pkkjj\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") " pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:34:58 crc kubenswrapper[4759]: I1205 01:34:58.154288 4759 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fs4t7\" (UniqueName: \"kubernetes.io/projected/4b299381-4904-4e5f-a678-87d92eb16279-kube-api-access-fs4t7\") pod \"redhat-marketplace-pkkjj\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") " pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:34:58 crc kubenswrapper[4759]: I1205 01:34:58.223495 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:34:58 crc kubenswrapper[4759]: I1205 01:34:58.774512 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkkjj"] Dec 05 01:34:59 crc kubenswrapper[4759]: I1205 01:34:59.075750 4759 generic.go:334] "Generic (PLEG): container finished" podID="4b299381-4904-4e5f-a678-87d92eb16279" containerID="da05e20596533a607e862f314c08cd4cab00d26022be7be02cabd571f705e18b" exitCode=0 Dec 05 01:34:59 crc kubenswrapper[4759]: I1205 01:34:59.075898 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkkjj" event={"ID":"4b299381-4904-4e5f-a678-87d92eb16279","Type":"ContainerDied","Data":"da05e20596533a607e862f314c08cd4cab00d26022be7be02cabd571f705e18b"} Dec 05 01:34:59 crc kubenswrapper[4759]: I1205 01:34:59.076104 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkkjj" event={"ID":"4b299381-4904-4e5f-a678-87d92eb16279","Type":"ContainerStarted","Data":"3810b24d4ed9d0a10e83f227f2c72c699ee04a19858990e0683108f195c6153a"} Dec 05 01:35:00 crc kubenswrapper[4759]: I1205 01:35:00.088439 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkkjj" event={"ID":"4b299381-4904-4e5f-a678-87d92eb16279","Type":"ContainerStarted","Data":"e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c"} Dec 05 01:35:01 crc kubenswrapper[4759]: I1205 01:35:01.104360 4759 generic.go:334] "Generic (PLEG): container finished" podID="4b299381-4904-4e5f-a678-87d92eb16279" containerID="e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c" exitCode=0 Dec 05 01:35:01 crc kubenswrapper[4759]: I1205 01:35:01.104410 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkkjj" event={"ID":"4b299381-4904-4e5f-a678-87d92eb16279","Type":"ContainerDied","Data":"e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c"} Dec 05 01:35:02 crc kubenswrapper[4759]: I1205 01:35:02.117043 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkkjj" event={"ID":"4b299381-4904-4e5f-a678-87d92eb16279","Type":"ContainerStarted","Data":"526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669"} Dec 05 01:35:02 crc kubenswrapper[4759]: I1205 01:35:02.144364 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pkkjj" podStartSLOduration=2.5674692500000003 podStartE2EDuration="5.144339554s" podCreationTimestamp="2025-12-05 01:34:57 +0000 UTC" firstStartedPulling="2025-12-05 01:34:59.077337668 +0000 UTC m=+4318.292998618" lastFinishedPulling="2025-12-05 01:35:01.654207972 +0000 UTC m=+4320.869868922" observedRunningTime="2025-12-05 01:35:02.137637484 +0000 UTC m=+4321.353298464" watchObservedRunningTime="2025-12-05 01:35:02.144339554 +0000 UTC m=+4321.360000504" Dec 05 01:35:08 crc kubenswrapper[4759]: I1205 01:35:08.200489 4759 generic.go:334] "Generic (PLEG): container finished" 
podID="3fac4138-c163-4a29-b1b0-b78285e908ec" containerID="2ed6f5d5410c4af0efe59ee839142b25c699041fc5adbf3bc37fb5975773c60c" exitCode=0 Dec 05 01:35:08 crc kubenswrapper[4759]: I1205 01:35:08.200660 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" event={"ID":"3fac4138-c163-4a29-b1b0-b78285e908ec","Type":"ContainerDied","Data":"2ed6f5d5410c4af0efe59ee839142b25c699041fc5adbf3bc37fb5975773c60c"} Dec 05 01:35:08 crc kubenswrapper[4759]: I1205 01:35:08.223809 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:35:08 crc kubenswrapper[4759]: I1205 01:35:08.223861 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:35:08 crc kubenswrapper[4759]: I1205 01:35:08.589948 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.273291 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pkkjj" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.329251 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkkjj"] Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.756371 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.896744 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph-nova-0\") pod \"3fac4138-c163-4a29-b1b0-b78285e908ec\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.897166 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-1\") pod \"3fac4138-c163-4a29-b1b0-b78285e908ec\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.897492 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7wzm\" (UniqueName: \"kubernetes.io/projected/3fac4138-c163-4a29-b1b0-b78285e908ec-kube-api-access-k7wzm\") pod \"3fac4138-c163-4a29-b1b0-b78285e908ec\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.897705 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-0\") pod \"3fac4138-c163-4a29-b1b0-b78285e908ec\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.897851 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-0\") pod \"3fac4138-c163-4a29-b1b0-b78285e908ec\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.897974 4759 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ssh-key\") pod \"3fac4138-c163-4a29-b1b0-b78285e908ec\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.898120 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-extra-config-0\") pod \"3fac4138-c163-4a29-b1b0-b78285e908ec\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.898348 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-1\") pod \"3fac4138-c163-4a29-b1b0-b78285e908ec\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.898541 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-custom-ceph-combined-ca-bundle\") pod \"3fac4138-c163-4a29-b1b0-b78285e908ec\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.898834 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph\") pod \"3fac4138-c163-4a29-b1b0-b78285e908ec\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.899073 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-inventory\") pod \"3fac4138-c163-4a29-b1b0-b78285e908ec\" (UID: \"3fac4138-c163-4a29-b1b0-b78285e908ec\") " Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.903130 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "3fac4138-c163-4a29-b1b0-b78285e908ec" (UID: "3fac4138-c163-4a29-b1b0-b78285e908ec"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.904905 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fac4138-c163-4a29-b1b0-b78285e908ec-kube-api-access-k7wzm" (OuterVolumeSpecName: "kube-api-access-k7wzm") pod "3fac4138-c163-4a29-b1b0-b78285e908ec" (UID: "3fac4138-c163-4a29-b1b0-b78285e908ec"). InnerVolumeSpecName "kube-api-access-k7wzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.906882 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph" (OuterVolumeSpecName: "ceph") pod "3fac4138-c163-4a29-b1b0-b78285e908ec" (UID: "3fac4138-c163-4a29-b1b0-b78285e908ec"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.931278 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "3fac4138-c163-4a29-b1b0-b78285e908ec" (UID: "3fac4138-c163-4a29-b1b0-b78285e908ec"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.932866 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "3fac4138-c163-4a29-b1b0-b78285e908ec" (UID: "3fac4138-c163-4a29-b1b0-b78285e908ec"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.935884 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "3fac4138-c163-4a29-b1b0-b78285e908ec" (UID: "3fac4138-c163-4a29-b1b0-b78285e908ec"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.950346 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "3fac4138-c163-4a29-b1b0-b78285e908ec" (UID: "3fac4138-c163-4a29-b1b0-b78285e908ec"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.950794 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3fac4138-c163-4a29-b1b0-b78285e908ec" (UID: "3fac4138-c163-4a29-b1b0-b78285e908ec"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.954174 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "3fac4138-c163-4a29-b1b0-b78285e908ec" (UID: "3fac4138-c163-4a29-b1b0-b78285e908ec"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.960478 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-inventory" (OuterVolumeSpecName: "inventory") pod "3fac4138-c163-4a29-b1b0-b78285e908ec" (UID: "3fac4138-c163-4a29-b1b0-b78285e908ec"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:35:09 crc kubenswrapper[4759]: I1205 01:35:09.977146 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "3fac4138-c163-4a29-b1b0-b78285e908ec" (UID: "3fac4138-c163-4a29-b1b0-b78285e908ec"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.002738 4759 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.002776 4759 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.002790 4759 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.002803 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.002815 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.002826 4759 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/3fac4138-c163-4a29-b1b0-b78285e908ec-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.002836 4759 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.002850 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7wzm\" (UniqueName: \"kubernetes.io/projected/3fac4138-c163-4a29-b1b0-b78285e908ec-kube-api-access-k7wzm\") on node \"crc\" DevicePath \"\"" Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.002863 4759 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.002874 4759 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.002885 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fac4138-c163-4a29-b1b0-b78285e908ec-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.222034 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.223388 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j" event={"ID":"3fac4138-c163-4a29-b1b0-b78285e908ec","Type":"ContainerDied","Data":"9cd433a1c3e6d713dfdda56f89c5e5cba04e0132df4f2dc6e7ac33dc3c6e76d1"}
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.223412 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cd433a1c3e6d713dfdda56f89c5e5cba04e0132df4f2dc6e7ac33dc3c6e76d1"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.336664 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"]
Dec 05 01:35:10 crc kubenswrapper[4759]: E1205 01:35:10.337154 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fac4138-c163-4a29-b1b0-b78285e908ec" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.337176 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fac4138-c163-4a29-b1b0-b78285e908ec" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.337389 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fac4138-c163-4a29-b1b0-b78285e908ec" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.338133 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.340170 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.340503 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.340743 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.341137 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.341622 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.342460 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.358575 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"]
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.513674 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.513834 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.513883 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.513929 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.514166 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.514440 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.514495 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.514534 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zjpz\" (UniqueName: \"kubernetes.io/projected/74529ffd-281e-4f93-b8a1-fc858a1369c4-kube-api-access-7zjpz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.616943 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.617193 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.617236 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.617320 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.617402 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.617426 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.617444 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zjpz\" (UniqueName: \"kubernetes.io/projected/74529ffd-281e-4f93-b8a1-fc858a1369c4-kube-api-access-7zjpz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.617484 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.622138 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.623024 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.623498 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.625340 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.626991 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.628619 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.630485 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.640196 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zjpz\" (UniqueName: \"kubernetes.io/projected/74529ffd-281e-4f93-b8a1-fc858a1369c4-kube-api-access-7zjpz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:10 crc kubenswrapper[4759]: I1205 01:35:10.674058 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"
Dec 05 01:35:11 crc kubenswrapper[4759]: I1205 01:35:11.170157 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc"
Dec 05 01:35:11 crc kubenswrapper[4759]: E1205 01:35:11.173325 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:35:11 crc kubenswrapper[4759]: I1205 01:35:11.232324 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pkkjj" podUID="4b299381-4904-4e5f-a678-87d92eb16279" containerName="registry-server" containerID="cri-o://526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669" gracePeriod=2
Dec 05 01:35:11 crc kubenswrapper[4759]: I1205 01:35:11.272877 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6"]
Dec 05 01:35:11 crc kubenswrapper[4759]: W1205 01:35:11.300801 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74529ffd_281e_4f93_b8a1_fc858a1369c4.slice/crio-cd75ffd24f268496914874df7054a82482fd45018de311a0ef0be60fc3a2d439 WatchSource:0}: Error finding container cd75ffd24f268496914874df7054a82482fd45018de311a0ef0be60fc3a2d439: Status 404 returned error can't find the container with id cd75ffd24f268496914874df7054a82482fd45018de311a0ef0be60fc3a2d439
Dec 05 01:35:11 crc kubenswrapper[4759]: I1205 01:35:11.806634 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkkjj"
Dec 05 01:35:11 crc kubenswrapper[4759]: I1205 01:35:11.970067 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-utilities\") pod \"4b299381-4904-4e5f-a678-87d92eb16279\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") "
Dec 05 01:35:11 crc kubenswrapper[4759]: I1205 01:35:11.970151 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-catalog-content\") pod \"4b299381-4904-4e5f-a678-87d92eb16279\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") "
Dec 05 01:35:11 crc kubenswrapper[4759]: I1205 01:35:11.970241 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs4t7\" (UniqueName: \"kubernetes.io/projected/4b299381-4904-4e5f-a678-87d92eb16279-kube-api-access-fs4t7\") pod \"4b299381-4904-4e5f-a678-87d92eb16279\" (UID: \"4b299381-4904-4e5f-a678-87d92eb16279\") "
Dec 05 01:35:11 crc kubenswrapper[4759]: I1205 01:35:11.972354 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-utilities" (OuterVolumeSpecName: "utilities") pod "4b299381-4904-4e5f-a678-87d92eb16279" (UID: "4b299381-4904-4e5f-a678-87d92eb16279"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:35:11 crc kubenswrapper[4759]: I1205 01:35:11.974926 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b299381-4904-4e5f-a678-87d92eb16279-kube-api-access-fs4t7" (OuterVolumeSpecName: "kube-api-access-fs4t7") pod "4b299381-4904-4e5f-a678-87d92eb16279" (UID: "4b299381-4904-4e5f-a678-87d92eb16279"). InnerVolumeSpecName "kube-api-access-fs4t7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:35:11 crc kubenswrapper[4759]: I1205 01:35:11.996864 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b299381-4904-4e5f-a678-87d92eb16279" (UID: "4b299381-4904-4e5f-a678-87d92eb16279"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.074374 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.074655 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b299381-4904-4e5f-a678-87d92eb16279-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.074793 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs4t7\" (UniqueName: \"kubernetes.io/projected/4b299381-4904-4e5f-a678-87d92eb16279-kube-api-access-fs4t7\") on node \"crc\" DevicePath \"\""
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.257844 4759 generic.go:334] "Generic (PLEG): container finished" podID="4b299381-4904-4e5f-a678-87d92eb16279" containerID="526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669" exitCode=0
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.258095 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pkkjj"
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.258113 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkkjj" event={"ID":"4b299381-4904-4e5f-a678-87d92eb16279","Type":"ContainerDied","Data":"526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669"}
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.258144 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pkkjj" event={"ID":"4b299381-4904-4e5f-a678-87d92eb16279","Type":"ContainerDied","Data":"3810b24d4ed9d0a10e83f227f2c72c699ee04a19858990e0683108f195c6153a"}
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.258162 4759 scope.go:117] "RemoveContainer" containerID="526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669"
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.268841 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6" event={"ID":"74529ffd-281e-4f93-b8a1-fc858a1369c4","Type":"ContainerStarted","Data":"e6c1b37c1bdf97c7d0fbc44aa1e7697e12a6cf5c175d76b16712ea5dbc11826d"}
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.269194 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6" event={"ID":"74529ffd-281e-4f93-b8a1-fc858a1369c4","Type":"ContainerStarted","Data":"cd75ffd24f268496914874df7054a82482fd45018de311a0ef0be60fc3a2d439"}
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.296637 4759 scope.go:117] "RemoveContainer" containerID="e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c"
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.319375 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6" podStartSLOduration=1.729061508 podStartE2EDuration="2.319354658s" podCreationTimestamp="2025-12-05 01:35:10 +0000 UTC" firstStartedPulling="2025-12-05 01:35:11.309442511 +0000 UTC m=+4330.525103461" lastFinishedPulling="2025-12-05 01:35:11.899735661 +0000 UTC m=+4331.115396611" observedRunningTime="2025-12-05 01:35:12.29487188 +0000 UTC m=+4331.510532840" watchObservedRunningTime="2025-12-05 01:35:12.319354658 +0000 UTC m=+4331.535015608"
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.329433 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkkjj"]
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.330006 4759 scope.go:117] "RemoveContainer" containerID="da05e20596533a607e862f314c08cd4cab00d26022be7be02cabd571f705e18b"
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.342064 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pkkjj"]
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.349915 4759 scope.go:117] "RemoveContainer" containerID="526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669"
Dec 05 01:35:12 crc kubenswrapper[4759]: E1205 01:35:12.350398 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669\": container with ID starting with 526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669 not found: ID does not exist" containerID="526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669"
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.350458 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669"} err="failed to get container status \"526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669\": rpc error: code = NotFound desc = could not find container \"526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669\": container with ID starting with 526de9d6176d3d3c753892ae20675675f98340c7e17048df6785ee8d2153a669 not found: ID does not exist"
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.350489 4759 scope.go:117] "RemoveContainer" containerID="e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c"
Dec 05 01:35:12 crc kubenswrapper[4759]: E1205 01:35:12.350820 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c\": container with ID starting with e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c not found: ID does not exist" containerID="e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c"
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.350853 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c"} err="failed to get container status \"e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c\": rpc error: code = NotFound desc = could not find container \"e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c\": container with ID starting with e6c97b3f370d1dfae9c461f36fc5b49284377c1365aa8f6faabbffc6c1bd1a2c not found: ID does not exist"
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.350880 4759 scope.go:117] "RemoveContainer" containerID="da05e20596533a607e862f314c08cd4cab00d26022be7be02cabd571f705e18b"
Dec 05 01:35:12 crc kubenswrapper[4759]: E1205 01:35:12.351220 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da05e20596533a607e862f314c08cd4cab00d26022be7be02cabd571f705e18b\": container with ID starting with da05e20596533a607e862f314c08cd4cab00d26022be7be02cabd571f705e18b not found: ID does not exist" containerID="da05e20596533a607e862f314c08cd4cab00d26022be7be02cabd571f705e18b"
Dec 05 01:35:12 crc kubenswrapper[4759]: I1205 01:35:12.351252 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da05e20596533a607e862f314c08cd4cab00d26022be7be02cabd571f705e18b"} err="failed to get container status \"da05e20596533a607e862f314c08cd4cab00d26022be7be02cabd571f705e18b\": rpc error: code = NotFound desc = could not find container \"da05e20596533a607e862f314c08cd4cab00d26022be7be02cabd571f705e18b\": container with ID starting with da05e20596533a607e862f314c08cd4cab00d26022be7be02cabd571f705e18b not found: ID does not exist"
Dec 05 01:35:13 crc kubenswrapper[4759]: I1205 01:35:13.182248 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b299381-4904-4e5f-a678-87d92eb16279" path="/var/lib/kubelet/pods/4b299381-4904-4e5f-a678-87d92eb16279/volumes"
Dec 05 01:35:26 crc kubenswrapper[4759]: I1205 01:35:26.156854 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc"
Dec 05 01:35:26 crc kubenswrapper[4759]: E1205 01:35:26.157623 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:35:38 crc kubenswrapper[4759]: I1205 01:35:38.156283 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc"
Dec 05 01:35:38 crc kubenswrapper[4759]: E1205 01:35:38.157701 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.321684 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rb527"]
Dec 05 01:35:44 crc kubenswrapper[4759]: E1205 01:35:44.322913 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b299381-4904-4e5f-a678-87d92eb16279" containerName="registry-server"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.322939 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b299381-4904-4e5f-a678-87d92eb16279" containerName="registry-server"
Dec 05 01:35:44 crc kubenswrapper[4759]: E1205 01:35:44.322966 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b299381-4904-4e5f-a678-87d92eb16279" containerName="extract-utilities"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.322978 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b299381-4904-4e5f-a678-87d92eb16279" containerName="extract-utilities"
Dec 05 01:35:44 crc kubenswrapper[4759]: E1205 01:35:44.323014 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b299381-4904-4e5f-a678-87d92eb16279" containerName="extract-content"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.323027 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b299381-4904-4e5f-a678-87d92eb16279" containerName="extract-content"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.324970 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b299381-4904-4e5f-a678-87d92eb16279" containerName="registry-server"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.329479 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.371482 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rb527"]
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.423907 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dbn5\" (UniqueName: \"kubernetes.io/projected/574b3f24-83a7-4087-a67a-b17ece37b6a6-kube-api-access-2dbn5\") pod \"community-operators-rb527\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") " pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.423981 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-utilities\") pod \"community-operators-rb527\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") " pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.424074 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-catalog-content\") pod \"community-operators-rb527\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") " pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.526023 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dbn5\" (UniqueName: \"kubernetes.io/projected/574b3f24-83a7-4087-a67a-b17ece37b6a6-kube-api-access-2dbn5\") pod \"community-operators-rb527\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") " pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.526082 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-utilities\") pod \"community-operators-rb527\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") " pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.526131 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-catalog-content\") pod \"community-operators-rb527\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") " pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.526658 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-utilities\") pod \"community-operators-rb527\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") " pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.526712 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-catalog-content\") pod \"community-operators-rb527\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") " pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.555961 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dbn5\" (UniqueName: \"kubernetes.io/projected/574b3f24-83a7-4087-a67a-b17ece37b6a6-kube-api-access-2dbn5\") pod \"community-operators-rb527\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") " pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:44 crc kubenswrapper[4759]: I1205 01:35:44.673965 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:45 crc kubenswrapper[4759]: I1205 01:35:45.265050 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rb527"]
Dec 05 01:35:45 crc kubenswrapper[4759]: W1205 01:35:45.266031 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod574b3f24_83a7_4087_a67a_b17ece37b6a6.slice/crio-e11de0ef368941b0edea0bdd0cc086d16c9c4f3b9ccb6e05f52a995b0f07f8b1 WatchSource:0}: Error finding container e11de0ef368941b0edea0bdd0cc086d16c9c4f3b9ccb6e05f52a995b0f07f8b1: Status 404 returned error can't find the container with id e11de0ef368941b0edea0bdd0cc086d16c9c4f3b9ccb6e05f52a995b0f07f8b1
Dec 05 01:35:45 crc kubenswrapper[4759]: I1205 01:35:45.689744 4759 generic.go:334] "Generic (PLEG): container finished" podID="574b3f24-83a7-4087-a67a-b17ece37b6a6" containerID="421fa4f2c4cd54a6ec66e5e68c0b09ea89367dc8a32089a2b7cd5d429e021bd3" exitCode=0
Dec 05 01:35:45 crc kubenswrapper[4759]: I1205 01:35:45.689843 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rb527" event={"ID":"574b3f24-83a7-4087-a67a-b17ece37b6a6","Type":"ContainerDied","Data":"421fa4f2c4cd54a6ec66e5e68c0b09ea89367dc8a32089a2b7cd5d429e021bd3"}
Dec 05 01:35:45 crc kubenswrapper[4759]: I1205 01:35:45.690450 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rb527" event={"ID":"574b3f24-83a7-4087-a67a-b17ece37b6a6","Type":"ContainerStarted","Data":"e11de0ef368941b0edea0bdd0cc086d16c9c4f3b9ccb6e05f52a995b0f07f8b1"}
Dec 05 01:35:46 crc kubenswrapper[4759]: I1205 01:35:46.701654 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rb527" event={"ID":"574b3f24-83a7-4087-a67a-b17ece37b6a6","Type":"ContainerStarted","Data":"9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5"}
Dec 05 01:35:47 crc kubenswrapper[4759]: I1205 01:35:47.740447 4759 generic.go:334] "Generic (PLEG): container finished" podID="574b3f24-83a7-4087-a67a-b17ece37b6a6" containerID="9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5" exitCode=0
Dec 05 01:35:47 crc kubenswrapper[4759]: I1205 01:35:47.740520 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rb527" event={"ID":"574b3f24-83a7-4087-a67a-b17ece37b6a6","Type":"ContainerDied","Data":"9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5"}
Dec 05 01:35:47 crc kubenswrapper[4759]: I1205 01:35:47.744249 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 01:35:48 crc kubenswrapper[4759]: I1205 01:35:48.752319 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rb527" event={"ID":"574b3f24-83a7-4087-a67a-b17ece37b6a6","Type":"ContainerStarted","Data":"2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6"}
Dec 05 01:35:48 crc kubenswrapper[4759]: I1205 01:35:48.785654 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rb527" podStartSLOduration=2.351145144 podStartE2EDuration="4.785631776s" podCreationTimestamp="2025-12-05 01:35:44 +0000 UTC" firstStartedPulling="2025-12-05 01:35:45.692775648 +0000 UTC m=+4364.908436598" lastFinishedPulling="2025-12-05 01:35:48.12726225 +0000 UTC m=+4367.342923230" observedRunningTime="2025-12-05 01:35:48.773969606 +0000 UTC m=+4367.989630556" watchObservedRunningTime="2025-12-05 01:35:48.785631776 +0000 UTC m=+4368.001292716"
Dec 05 01:35:53 crc kubenswrapper[4759]: I1205 01:35:53.155909 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc"
Dec 05 01:35:53 crc kubenswrapper[4759]: E1205 01:35:53.156756 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:35:54 crc kubenswrapper[4759]: I1205 01:35:54.675176 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:54 crc kubenswrapper[4759]: I1205 01:35:54.675533 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:54 crc kubenswrapper[4759]: I1205 01:35:54.729978 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:54 crc kubenswrapper[4759]: I1205 01:35:54.864852 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:54 crc kubenswrapper[4759]: I1205 01:35:54.971796 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rb527"]
Dec 05 01:35:56 crc kubenswrapper[4759]: I1205 01:35:56.833530 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rb527" podUID="574b3f24-83a7-4087-a67a-b17ece37b6a6" containerName="registry-server" containerID="cri-o://2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6" gracePeriod=2
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.280268 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rb527"
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.403905 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dbn5\" (UniqueName: \"kubernetes.io/projected/574b3f24-83a7-4087-a67a-b17ece37b6a6-kube-api-access-2dbn5\") pod \"574b3f24-83a7-4087-a67a-b17ece37b6a6\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") "
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.404047 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-catalog-content\") pod \"574b3f24-83a7-4087-a67a-b17ece37b6a6\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") "
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.404109 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-utilities\") pod \"574b3f24-83a7-4087-a67a-b17ece37b6a6\" (UID: \"574b3f24-83a7-4087-a67a-b17ece37b6a6\") "
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.405088 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-utilities" (OuterVolumeSpecName: "utilities") pod "574b3f24-83a7-4087-a67a-b17ece37b6a6" (UID: "574b3f24-83a7-4087-a67a-b17ece37b6a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.411880 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/574b3f24-83a7-4087-a67a-b17ece37b6a6-kube-api-access-2dbn5" (OuterVolumeSpecName: "kube-api-access-2dbn5") pod "574b3f24-83a7-4087-a67a-b17ece37b6a6" (UID: "574b3f24-83a7-4087-a67a-b17ece37b6a6"). InnerVolumeSpecName "kube-api-access-2dbn5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.453123 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "574b3f24-83a7-4087-a67a-b17ece37b6a6" (UID: "574b3f24-83a7-4087-a67a-b17ece37b6a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.525131 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dbn5\" (UniqueName: \"kubernetes.io/projected/574b3f24-83a7-4087-a67a-b17ece37b6a6-kube-api-access-2dbn5\") on node \"crc\" DevicePath \"\""
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.525515 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.525540 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574b3f24-83a7-4087-a67a-b17ece37b6a6-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.853782 4759 generic.go:334] "Generic (PLEG): container finished" podID="574b3f24-83a7-4087-a67a-b17ece37b6a6" containerID="2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6" exitCode=0
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.853827 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rb527" event={"ID":"574b3f24-83a7-4087-a67a-b17ece37b6a6","Type":"ContainerDied","Data":"2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6"}
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.853880 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rb527" event={"ID":"574b3f24-83a7-4087-a67a-b17ece37b6a6","Type":"ContainerDied","Data":"e11de0ef368941b0edea0bdd0cc086d16c9c4f3b9ccb6e05f52a995b0f07f8b1"}
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.853907 4759 scope.go:117] "RemoveContainer" containerID="2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6"
Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.853915 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rb527"
Need to start a new one" pod="openshift-marketplace/community-operators-rb527" Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.910996 4759 scope.go:117] "RemoveContainer" containerID="9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5" Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.927888 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rb527"] Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.945827 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rb527"] Dec 05 01:35:57 crc kubenswrapper[4759]: I1205 01:35:57.953213 4759 scope.go:117] "RemoveContainer" containerID="421fa4f2c4cd54a6ec66e5e68c0b09ea89367dc8a32089a2b7cd5d429e021bd3" Dec 05 01:35:58 crc kubenswrapper[4759]: I1205 01:35:58.010384 4759 scope.go:117] "RemoveContainer" containerID="2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6" Dec 05 01:35:58 crc kubenswrapper[4759]: E1205 01:35:58.010888 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6\": container with ID starting with 2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6 not found: ID does not exist" containerID="2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6" Dec 05 01:35:58 crc kubenswrapper[4759]: I1205 01:35:58.010945 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6"} err="failed to get container status \"2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6\": rpc error: code = NotFound desc = could not find container \"2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6\": container with ID starting with 2c55f2faa189540157d6f087f053c4f65fa68919bc78e4928ae97228a49cb4e6 not found: ID does not exist" Dec 05 01:35:58 crc kubenswrapper[4759]: I1205 01:35:58.010974 4759 scope.go:117] "RemoveContainer" containerID="9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5" Dec 05 01:35:58 crc kubenswrapper[4759]: E1205 01:35:58.011564 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5\": container with ID starting with 9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5 not found: ID does not exist" containerID="9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5" Dec 05 01:35:58 crc kubenswrapper[4759]: I1205 01:35:58.011606 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5"} err="failed to get container status \"9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5\": rpc error: code = NotFound desc = could not find container \"9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5\": container with ID starting with 9c18d89ad3b2abe939ada21097c647877148023e1605c145e697300d1961e4e5 not found: ID does not exist" Dec 05 01:35:58 crc kubenswrapper[4759]: I1205 01:35:58.011630 4759 scope.go:117] "RemoveContainer" containerID="421fa4f2c4cd54a6ec66e5e68c0b09ea89367dc8a32089a2b7cd5d429e021bd3" Dec 05 01:35:58 crc kubenswrapper[4759]: E1205 01:35:58.011953 4759 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"421fa4f2c4cd54a6ec66e5e68c0b09ea89367dc8a32089a2b7cd5d429e021bd3\": container with ID starting with 421fa4f2c4cd54a6ec66e5e68c0b09ea89367dc8a32089a2b7cd5d429e021bd3 not found: ID does not exist" containerID="421fa4f2c4cd54a6ec66e5e68c0b09ea89367dc8a32089a2b7cd5d429e021bd3" Dec 05 01:35:58 crc kubenswrapper[4759]: I1205 01:35:58.011992 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421fa4f2c4cd54a6ec66e5e68c0b09ea89367dc8a32089a2b7cd5d429e021bd3"} err="failed to get container status \"421fa4f2c4cd54a6ec66e5e68c0b09ea89367dc8a32089a2b7cd5d429e021bd3\": rpc error: code = NotFound desc = could not find container \"421fa4f2c4cd54a6ec66e5e68c0b09ea89367dc8a32089a2b7cd5d429e021bd3\": container with ID starting with 421fa4f2c4cd54a6ec66e5e68c0b09ea89367dc8a32089a2b7cd5d429e021bd3 not found: ID does not exist" Dec 05 01:35:59 crc kubenswrapper[4759]: I1205 01:35:59.169987 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="574b3f24-83a7-4087-a67a-b17ece37b6a6" path="/var/lib/kubelet/pods/574b3f24-83a7-4087-a67a-b17ece37b6a6/volumes" Dec 05 01:36:06 crc kubenswrapper[4759]: I1205 01:36:06.156275 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:36:06 crc kubenswrapper[4759]: E1205 01:36:06.157332 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:36:19 crc kubenswrapper[4759]: I1205 01:36:19.157353 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:36:19 crc kubenswrapper[4759]: E1205 01:36:19.158848 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:36:30 crc kubenswrapper[4759]: I1205 01:36:30.157522 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:36:30 crc kubenswrapper[4759]: E1205 01:36:30.158494 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:36:42 crc kubenswrapper[4759]: I1205 01:36:42.156123 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:36:42 crc kubenswrapper[4759]: E1205 01:36:42.156915 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:36:55 crc kubenswrapper[4759]: I1205 01:36:55.156113 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:36:55 crc kubenswrapper[4759]: E1205 01:36:55.157242 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:37:09 crc kubenswrapper[4759]: I1205 01:37:09.157375 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:37:09 crc kubenswrapper[4759]: E1205 01:37:09.158537 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:37:21 crc kubenswrapper[4759]: I1205 01:37:21.164451 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:37:21 crc kubenswrapper[4759]: E1205 01:37:21.165544 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:37:35 crc kubenswrapper[4759]: I1205 01:37:35.156211 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:37:35 crc kubenswrapper[4759]: E1205 01:37:35.157100 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:37:46 crc kubenswrapper[4759]: I1205 01:37:46.155472 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:37:46 crc kubenswrapper[4759]: E1205 01:37:46.156273 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:37:58 crc kubenswrapper[4759]: I1205 01:37:58.156337 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:37:58 crc kubenswrapper[4759]: E1205 01:37:58.157389 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:38:11 crc kubenswrapper[4759]: I1205 01:38:11.164157 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:38:11 crc kubenswrapper[4759]: E1205 01:38:11.165034 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:38:22 crc kubenswrapper[4759]: I1205 01:38:22.156101 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:38:22 crc kubenswrapper[4759]: E1205 01:38:22.157260 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.246176 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qnmzh"] Dec 05 01:38:32 crc kubenswrapper[4759]: E1205 01:38:32.247274 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574b3f24-83a7-4087-a67a-b17ece37b6a6" containerName="extract-utilities" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.247289 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="574b3f24-83a7-4087-a67a-b17ece37b6a6" containerName="extract-utilities" Dec 05 01:38:32 crc kubenswrapper[4759]: E1205 01:38:32.247344 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574b3f24-83a7-4087-a67a-b17ece37b6a6" containerName="extract-content" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.247355 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="574b3f24-83a7-4087-a67a-b17ece37b6a6" containerName="extract-content" Dec 05 01:38:32 crc kubenswrapper[4759]: E1205 01:38:32.247373 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574b3f24-83a7-4087-a67a-b17ece37b6a6" containerName="registry-server" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.247381 4759 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="574b3f24-83a7-4087-a67a-b17ece37b6a6" containerName="registry-server" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.247641 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="574b3f24-83a7-4087-a67a-b17ece37b6a6" containerName="registry-server" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.249601 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.270010 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnmzh"] Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.414966 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-catalog-content\") pod \"redhat-operators-qnmzh\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.415025 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shdxh\" (UniqueName: \"kubernetes.io/projected/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-kube-api-access-shdxh\") pod \"redhat-operators-qnmzh\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.415052 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-utilities\") pod \"redhat-operators-qnmzh\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.516637 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shdxh\" (UniqueName: \"kubernetes.io/projected/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-kube-api-access-shdxh\") pod \"redhat-operators-qnmzh\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.516703 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-utilities\") pod \"redhat-operators-qnmzh\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.516927 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-catalog-content\") pod \"redhat-operators-qnmzh\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.517290 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-utilities\") pod \"redhat-operators-qnmzh\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.517355 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-catalog-content\") pod \"redhat-operators-qnmzh\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.651173 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shdxh\" (UniqueName: \"kubernetes.io/projected/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-kube-api-access-shdxh\") pod \"redhat-operators-qnmzh\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:32 crc kubenswrapper[4759]: I1205 01:38:32.892578 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:33 crc kubenswrapper[4759]: I1205 01:38:33.432431 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnmzh"] Dec 05 01:38:33 crc kubenswrapper[4759]: I1205 01:38:33.746474 4759 generic.go:334] "Generic (PLEG): container finished" podID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" containerID="43ba54ebb44c5ed24b81c6ddd86dbee6c4764e8385b8167661c21072dfb71390" exitCode=0 Dec 05 01:38:33 crc kubenswrapper[4759]: I1205 01:38:33.746569 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmzh" event={"ID":"7bdc985b-28cd-40f6-b6c0-03b2fda15b85","Type":"ContainerDied","Data":"43ba54ebb44c5ed24b81c6ddd86dbee6c4764e8385b8167661c21072dfb71390"} Dec 05 01:38:33 crc kubenswrapper[4759]: I1205 01:38:33.746994 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmzh" event={"ID":"7bdc985b-28cd-40f6-b6c0-03b2fda15b85","Type":"ContainerStarted","Data":"26f8a83d1b1dfca2f0c2d5d6b48a35778c888a397dd51f1ecfb43ecec488e2c6"} Dec 05 01:38:34 crc kubenswrapper[4759]: I1205 01:38:34.155975 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:38:34 crc kubenswrapper[4759]: E1205 01:38:34.156243 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:38:34 crc kubenswrapper[4759]: I1205 01:38:34.761487 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmzh" event={"ID":"7bdc985b-28cd-40f6-b6c0-03b2fda15b85","Type":"ContainerStarted","Data":"092578e7ad4ad5eb416ed9c20bd271ce92bda3147e7f361d144570d8bb467fd3"} Dec 05 01:38:37 crc kubenswrapper[4759]: I1205 01:38:37.797225 4759 generic.go:334] "Generic (PLEG): container finished" podID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" containerID="092578e7ad4ad5eb416ed9c20bd271ce92bda3147e7f361d144570d8bb467fd3" exitCode=0 Dec 05 01:38:37 crc kubenswrapper[4759]: I1205 01:38:37.797699 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmzh" event={"ID":"7bdc985b-28cd-40f6-b6c0-03b2fda15b85","Type":"ContainerDied","Data":"092578e7ad4ad5eb416ed9c20bd271ce92bda3147e7f361d144570d8bb467fd3"} Dec 05 01:38:38 crc kubenswrapper[4759]: I1205 01:38:38.815012 4759 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmzh" event={"ID":"7bdc985b-28cd-40f6-b6c0-03b2fda15b85","Type":"ContainerStarted","Data":"caea40e07998612ce22a9fbc5c55c5b550e7757c6e86caecd65f39bc375bc29b"} Dec 05 01:38:38 crc kubenswrapper[4759]: I1205 01:38:38.840708 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qnmzh" podStartSLOduration=2.15234638 podStartE2EDuration="6.840676248s" podCreationTimestamp="2025-12-05 01:38:32 +0000 UTC" firstStartedPulling="2025-12-05 01:38:33.748435135 +0000 UTC m=+4532.964096085" lastFinishedPulling="2025-12-05 01:38:38.436764993 +0000 UTC m=+4537.652425953" observedRunningTime="2025-12-05 01:38:38.833807772 +0000 UTC m=+4538.049468762" watchObservedRunningTime="2025-12-05 01:38:38.840676248 +0000 UTC m=+4538.056337198" Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.282113 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mbr9k"] Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.285543 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.297080 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mbr9k"] Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.427293 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6dn7\" (UniqueName: \"kubernetes.io/projected/d117f3c7-6d75-485c-9fcf-7072abc70551-kube-api-access-x6dn7\") pod \"certified-operators-mbr9k\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.427363 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-catalog-content\") pod \"certified-operators-mbr9k\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.427405 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-utilities\") pod \"certified-operators-mbr9k\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.529684 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6dn7\" (UniqueName: \"kubernetes.io/projected/d117f3c7-6d75-485c-9fcf-7072abc70551-kube-api-access-x6dn7\") pod \"certified-operators-mbr9k\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.529733 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-catalog-content\") pod \"certified-operators-mbr9k\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.529778 4759 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-utilities\") pod \"certified-operators-mbr9k\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.530405 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-utilities\") pod \"certified-operators-mbr9k\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.530446 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-catalog-content\") pod \"certified-operators-mbr9k\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.652128 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6dn7\" (UniqueName: \"kubernetes.io/projected/d117f3c7-6d75-485c-9fcf-7072abc70551-kube-api-access-x6dn7\") pod \"certified-operators-mbr9k\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:41 crc kubenswrapper[4759]: I1205 01:38:41.910634 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:42 crc kubenswrapper[4759]: I1205 01:38:42.429872 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mbr9k"] Dec 05 01:38:42 crc kubenswrapper[4759]: I1205 01:38:42.857152 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbr9k" event={"ID":"d117f3c7-6d75-485c-9fcf-7072abc70551","Type":"ContainerStarted","Data":"c5af12a34f4d5224ab4181b8b51dafd9b83179fdf6c218489104c35556cc14b8"} Dec 05 01:38:42 crc kubenswrapper[4759]: I1205 01:38:42.893717 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:42 crc kubenswrapper[4759]: I1205 01:38:42.893774 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:43 crc kubenswrapper[4759]: I1205 01:38:43.869865 4759 generic.go:334] "Generic (PLEG): container finished" podID="d117f3c7-6d75-485c-9fcf-7072abc70551" containerID="3d3ab91f943a1760053c56f3dd8988706f185754225999c0e568407ea7208da9" exitCode=0 Dec 05 01:38:43 crc kubenswrapper[4759]: I1205 01:38:43.869922 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbr9k" event={"ID":"d117f3c7-6d75-485c-9fcf-7072abc70551","Type":"ContainerDied","Data":"3d3ab91f943a1760053c56f3dd8988706f185754225999c0e568407ea7208da9"} Dec 05 01:38:43 crc kubenswrapper[4759]: I1205 01:38:43.949573 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qnmzh" podUID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" containerName="registry-server" probeResult="failure" output=< Dec 05 01:38:43 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 01:38:43 crc kubenswrapper[4759]: > Dec 05 01:38:45 crc kubenswrapper[4759]: 
I1205 01:38:45.891606 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbr9k" event={"ID":"d117f3c7-6d75-485c-9fcf-7072abc70551","Type":"ContainerStarted","Data":"39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187"} Dec 05 01:38:46 crc kubenswrapper[4759]: I1205 01:38:46.907976 4759 generic.go:334] "Generic (PLEG): container finished" podID="d117f3c7-6d75-485c-9fcf-7072abc70551" containerID="39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187" exitCode=0 Dec 05 01:38:46 crc kubenswrapper[4759]: I1205 01:38:46.908094 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbr9k" event={"ID":"d117f3c7-6d75-485c-9fcf-7072abc70551","Type":"ContainerDied","Data":"39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187"} Dec 05 01:38:46 crc kubenswrapper[4759]: I1205 01:38:46.911812 4759 generic.go:334] "Generic (PLEG): container finished" podID="74529ffd-281e-4f93-b8a1-fc858a1369c4" containerID="e6c1b37c1bdf97c7d0fbc44aa1e7697e12a6cf5c175d76b16712ea5dbc11826d" exitCode=0 Dec 05 01:38:46 crc kubenswrapper[4759]: I1205 01:38:46.911899 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6" event={"ID":"74529ffd-281e-4f93-b8a1-fc858a1369c4","Type":"ContainerDied","Data":"e6c1b37c1bdf97c7d0fbc44aa1e7697e12a6cf5c175d76b16712ea5dbc11826d"} Dec 05 01:38:47 crc kubenswrapper[4759]: I1205 01:38:47.155979 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:38:47 crc kubenswrapper[4759]: E1205 01:38:47.156686 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:38:47 crc kubenswrapper[4759]: I1205 01:38:47.926380 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbr9k" event={"ID":"d117f3c7-6d75-485c-9fcf-7072abc70551","Type":"ContainerStarted","Data":"9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04"} Dec 05 01:38:47 crc kubenswrapper[4759]: I1205 01:38:47.958782 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mbr9k" podStartSLOduration=3.492852001 podStartE2EDuration="6.958748778s" podCreationTimestamp="2025-12-05 01:38:41 +0000 UTC" firstStartedPulling="2025-12-05 01:38:43.872344541 +0000 UTC m=+4543.088005501" lastFinishedPulling="2025-12-05 01:38:47.338241318 +0000 UTC m=+4546.553902278" observedRunningTime="2025-12-05 01:38:47.955859578 +0000 UTC m=+4547.171520538" watchObservedRunningTime="2025-12-05 01:38:47.958748778 +0000 UTC m=+4547.174409768" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.442233 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.585051 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ssh-key\") pod \"74529ffd-281e-4f93-b8a1-fc858a1369c4\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.585197 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceph\") pod \"74529ffd-281e-4f93-b8a1-fc858a1369c4\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.585292 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-2\") pod \"74529ffd-281e-4f93-b8a1-fc858a1369c4\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.585370 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zjpz\" (UniqueName: \"kubernetes.io/projected/74529ffd-281e-4f93-b8a1-fc858a1369c4-kube-api-access-7zjpz\") pod \"74529ffd-281e-4f93-b8a1-fc858a1369c4\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.585417 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-0\") pod \"74529ffd-281e-4f93-b8a1-fc858a1369c4\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.585536 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-inventory\") pod \"74529ffd-281e-4f93-b8a1-fc858a1369c4\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.585612 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-telemetry-combined-ca-bundle\") pod \"74529ffd-281e-4f93-b8a1-fc858a1369c4\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.585671 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-1\") pod \"74529ffd-281e-4f93-b8a1-fc858a1369c4\" (UID: \"74529ffd-281e-4f93-b8a1-fc858a1369c4\") " Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.590787 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "74529ffd-281e-4f93-b8a1-fc858a1369c4" (UID: "74529ffd-281e-4f93-b8a1-fc858a1369c4"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.593020 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74529ffd-281e-4f93-b8a1-fc858a1369c4-kube-api-access-7zjpz" (OuterVolumeSpecName: "kube-api-access-7zjpz") pod "74529ffd-281e-4f93-b8a1-fc858a1369c4" (UID: "74529ffd-281e-4f93-b8a1-fc858a1369c4"). InnerVolumeSpecName "kube-api-access-7zjpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.593844 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceph" (OuterVolumeSpecName: "ceph") pod "74529ffd-281e-4f93-b8a1-fc858a1369c4" (UID: "74529ffd-281e-4f93-b8a1-fc858a1369c4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.626332 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "74529ffd-281e-4f93-b8a1-fc858a1369c4" (UID: "74529ffd-281e-4f93-b8a1-fc858a1369c4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.626663 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "74529ffd-281e-4f93-b8a1-fc858a1369c4" (UID: "74529ffd-281e-4f93-b8a1-fc858a1369c4"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.628807 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "74529ffd-281e-4f93-b8a1-fc858a1369c4" (UID: "74529ffd-281e-4f93-b8a1-fc858a1369c4"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.633502 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-inventory" (OuterVolumeSpecName: "inventory") pod "74529ffd-281e-4f93-b8a1-fc858a1369c4" (UID: "74529ffd-281e-4f93-b8a1-fc858a1369c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.646244 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "74529ffd-281e-4f93-b8a1-fc858a1369c4" (UID: "74529ffd-281e-4f93-b8a1-fc858a1369c4"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.688894 4759 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.688936 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.688952 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.688967 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.688979 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.688996 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zjpz\" (UniqueName: \"kubernetes.io/projected/74529ffd-281e-4f93-b8a1-fc858a1369c4-kube-api-access-7zjpz\") on node \"crc\" DevicePath \"\"" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.689008 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.689024 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74529ffd-281e-4f93-b8a1-fc858a1369c4-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.940876 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6" event={"ID":"74529ffd-281e-4f93-b8a1-fc858a1369c4","Type":"ContainerDied","Data":"cd75ffd24f268496914874df7054a82482fd45018de311a0ef0be60fc3a2d439"} Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.940948 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd75ffd24f268496914874df7054a82482fd45018de311a0ef0be60fc3a2d439" Dec 05 01:38:48 crc kubenswrapper[4759]: I1205 01:38:48.942361 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.277021 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92"] Dec 05 01:38:49 crc kubenswrapper[4759]: E1205 01:38:49.277583 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74529ffd-281e-4f93-b8a1-fc858a1369c4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.277606 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="74529ffd-281e-4f93-b8a1-fc858a1369c4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.277930 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="74529ffd-281e-4f93-b8a1-fc858a1369c4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.278839 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.287548 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92"] Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.292192 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.292411 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.292470 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.292788 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.293471 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.294232 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.403998 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.404057 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.404139 
4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.404159 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.404210 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8czxl\" (UniqueName: \"kubernetes.io/projected/ef0b2002-5521-4629-8083-fd25b382c0db-kube-api-access-8czxl\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.404239 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceph\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.404257 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.404322 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.506533 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceph\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.506586 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ssh-key\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.506667 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.506702 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.506739 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.506833 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.506862 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.506927 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8czxl\" (UniqueName: \"kubernetes.io/projected/ef0b2002-5521-4629-8083-fd25b382c0db-kube-api-access-8czxl\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.511856 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.511930 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.512435 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.512512 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.513383 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceph\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.514162 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.525087 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.530731 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8czxl\" (UniqueName: \"kubernetes.io/projected/ef0b2002-5521-4629-8083-fd25b382c0db-kube-api-access-8czxl\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:49 crc kubenswrapper[4759]: I1205 01:38:49.607137 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:38:50 crc kubenswrapper[4759]: W1205 01:38:50.150764 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef0b2002_5521_4629_8083_fd25b382c0db.slice/crio-6aba6de519ee0bd75a1af2d4727fd5efe26796331fe3c8830148caa5559daf77 WatchSource:0}: Error finding container 6aba6de519ee0bd75a1af2d4727fd5efe26796331fe3c8830148caa5559daf77: Status 404 returned error can't find the container with id 6aba6de519ee0bd75a1af2d4727fd5efe26796331fe3c8830148caa5559daf77 Dec 05 01:38:50 crc kubenswrapper[4759]: I1205 01:38:50.168995 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92"] Dec 05 01:38:50 crc kubenswrapper[4759]: I1205 01:38:50.963879 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" event={"ID":"ef0b2002-5521-4629-8083-fd25b382c0db","Type":"ContainerStarted","Data":"6aba6de519ee0bd75a1af2d4727fd5efe26796331fe3c8830148caa5559daf77"} Dec 05 01:38:51 crc kubenswrapper[4759]: I1205 01:38:51.911628 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:51 crc kubenswrapper[4759]: I1205 01:38:51.912056 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:51 crc kubenswrapper[4759]: I1205 01:38:51.978535 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" event={"ID":"ef0b2002-5521-4629-8083-fd25b382c0db","Type":"ContainerStarted","Data":"2f9378d7f2641623a1651f041dad44815a3b2f3c6d19b083d6ce9b6cf893a0af"} Dec 05 01:38:51 crc kubenswrapper[4759]: I1205 01:38:51.988579 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:38:51 crc kubenswrapper[4759]: I1205 01:38:51.999841 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" podStartSLOduration=2.469507217 podStartE2EDuration="2.999805001s" podCreationTimestamp="2025-12-05 01:38:49 +0000 UTC" firstStartedPulling="2025-12-05 01:38:50.152503964 +0000 UTC m=+4549.368164914" lastFinishedPulling="2025-12-05 01:38:50.682801708 +0000 UTC m=+4549.898462698" observedRunningTime="2025-12-05 01:38:51.9968646 +0000 UTC m=+4551.212525560" watchObservedRunningTime="2025-12-05 01:38:51.999805001 +0000 UTC m=+4551.215465951" Dec 05 01:38:52 crc kubenswrapper[4759]: I1205 01:38:52.957907 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:53 crc kubenswrapper[4759]: I1205 01:38:53.009868 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:53 crc kubenswrapper[4759]: I1205 01:38:53.242175 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnmzh"] Dec 05 01:38:54 crc kubenswrapper[4759]: I1205 01:38:54.017356 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qnmzh" podUID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" 
containerName="registry-server" containerID="cri-o://caea40e07998612ce22a9fbc5c55c5b550e7757c6e86caecd65f39bc375bc29b" gracePeriod=2 Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.044978 4759 generic.go:334] "Generic (PLEG): container finished" podID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" containerID="caea40e07998612ce22a9fbc5c55c5b550e7757c6e86caecd65f39bc375bc29b" exitCode=0 Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.045160 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmzh" event={"ID":"7bdc985b-28cd-40f6-b6c0-03b2fda15b85","Type":"ContainerDied","Data":"caea40e07998612ce22a9fbc5c55c5b550e7757c6e86caecd65f39bc375bc29b"} Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.172217 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.228547 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shdxh\" (UniqueName: \"kubernetes.io/projected/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-kube-api-access-shdxh\") pod \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.228685 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-utilities\") pod \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.228801 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-catalog-content\") pod \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\" (UID: \"7bdc985b-28cd-40f6-b6c0-03b2fda15b85\") " Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.229785 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-utilities" (OuterVolumeSpecName: "utilities") pod "7bdc985b-28cd-40f6-b6c0-03b2fda15b85" (UID: "7bdc985b-28cd-40f6-b6c0-03b2fda15b85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.238510 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-kube-api-access-shdxh" (OuterVolumeSpecName: "kube-api-access-shdxh") pod "7bdc985b-28cd-40f6-b6c0-03b2fda15b85" (UID: "7bdc985b-28cd-40f6-b6c0-03b2fda15b85"). InnerVolumeSpecName "kube-api-access-shdxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.331218 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shdxh\" (UniqueName: \"kubernetes.io/projected/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-kube-api-access-shdxh\") on node \"crc\" DevicePath \"\"" Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.331255 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.334513 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bdc985b-28cd-40f6-b6c0-03b2fda15b85" (UID: "7bdc985b-28cd-40f6-b6c0-03b2fda15b85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:38:55 crc kubenswrapper[4759]: I1205 01:38:55.433420 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bdc985b-28cd-40f6-b6c0-03b2fda15b85-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:38:56 crc kubenswrapper[4759]: I1205 01:38:56.061944 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmzh" event={"ID":"7bdc985b-28cd-40f6-b6c0-03b2fda15b85","Type":"ContainerDied","Data":"26f8a83d1b1dfca2f0c2d5d6b48a35778c888a397dd51f1ecfb43ecec488e2c6"} Dec 05 01:38:56 crc kubenswrapper[4759]: I1205 01:38:56.062024 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnmzh" Dec 05 01:38:56 crc kubenswrapper[4759]: I1205 01:38:56.062342 4759 scope.go:117] "RemoveContainer" containerID="caea40e07998612ce22a9fbc5c55c5b550e7757c6e86caecd65f39bc375bc29b" Dec 05 01:38:56 crc kubenswrapper[4759]: I1205 01:38:56.092111 4759 scope.go:117] "RemoveContainer" containerID="092578e7ad4ad5eb416ed9c20bd271ce92bda3147e7f361d144570d8bb467fd3" Dec 05 01:38:56 crc kubenswrapper[4759]: I1205 01:38:56.124476 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnmzh"] Dec 05 01:38:56 crc kubenswrapper[4759]: I1205 01:38:56.152936 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qnmzh"] Dec 05 01:38:56 crc kubenswrapper[4759]: I1205 01:38:56.164796 4759 scope.go:117] "RemoveContainer" containerID="43ba54ebb44c5ed24b81c6ddd86dbee6c4764e8385b8167661c21072dfb71390" Dec 05 01:38:57 crc kubenswrapper[4759]: I1205 01:38:57.169381 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" path="/var/lib/kubelet/pods/7bdc985b-28cd-40f6-b6c0-03b2fda15b85/volumes" Dec 05 01:39:01 crc kubenswrapper[4759]: I1205 01:39:01.983027 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:39:02 crc kubenswrapper[4759]: I1205 01:39:02.038252 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mbr9k"] Dec 05 01:39:02 crc kubenswrapper[4759]: I1205 01:39:02.130074 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mbr9k" podUID="d117f3c7-6d75-485c-9fcf-7072abc70551" 
containerName="registry-server" containerID="cri-o://9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04" gracePeriod=2 Dec 05 01:39:02 crc kubenswrapper[4759]: I1205 01:39:02.159073 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:39:02 crc kubenswrapper[4759]: E1205 01:39:02.159279 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:39:02 crc kubenswrapper[4759]: I1205 01:39:02.833106 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.018791 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-utilities\") pod \"d117f3c7-6d75-485c-9fcf-7072abc70551\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.019632 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-utilities" (OuterVolumeSpecName: "utilities") pod "d117f3c7-6d75-485c-9fcf-7072abc70551" (UID: "d117f3c7-6d75-485c-9fcf-7072abc70551"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.019955 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6dn7\" (UniqueName: \"kubernetes.io/projected/d117f3c7-6d75-485c-9fcf-7072abc70551-kube-api-access-x6dn7\") pod \"d117f3c7-6d75-485c-9fcf-7072abc70551\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.020084 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-catalog-content\") pod \"d117f3c7-6d75-485c-9fcf-7072abc70551\" (UID: \"d117f3c7-6d75-485c-9fcf-7072abc70551\") " Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.022332 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.027387 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d117f3c7-6d75-485c-9fcf-7072abc70551-kube-api-access-x6dn7" (OuterVolumeSpecName: "kube-api-access-x6dn7") pod "d117f3c7-6d75-485c-9fcf-7072abc70551" (UID: "d117f3c7-6d75-485c-9fcf-7072abc70551"). InnerVolumeSpecName "kube-api-access-x6dn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.086478 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d117f3c7-6d75-485c-9fcf-7072abc70551" (UID: "d117f3c7-6d75-485c-9fcf-7072abc70551"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.126137 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6dn7\" (UniqueName: \"kubernetes.io/projected/d117f3c7-6d75-485c-9fcf-7072abc70551-kube-api-access-x6dn7\") on node \"crc\" DevicePath \"\"" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.126171 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d117f3c7-6d75-485c-9fcf-7072abc70551-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.143931 4759 generic.go:334] "Generic (PLEG): container finished" podID="d117f3c7-6d75-485c-9fcf-7072abc70551" containerID="9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04" exitCode=0 Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.143983 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbr9k" event={"ID":"d117f3c7-6d75-485c-9fcf-7072abc70551","Type":"ContainerDied","Data":"9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04"} Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.144007 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbr9k" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.144036 4759 scope.go:117] "RemoveContainer" containerID="9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.144020 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbr9k" event={"ID":"d117f3c7-6d75-485c-9fcf-7072abc70551","Type":"ContainerDied","Data":"c5af12a34f4d5224ab4181b8b51dafd9b83179fdf6c218489104c35556cc14b8"} Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.167756 4759 scope.go:117] "RemoveContainer" containerID="39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.212808 4759 scope.go:117] "RemoveContainer" containerID="3d3ab91f943a1760053c56f3dd8988706f185754225999c0e568407ea7208da9" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.227773 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mbr9k"] Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.240053 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mbr9k"] Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.253639 4759 scope.go:117] "RemoveContainer" containerID="9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04" Dec 05 01:39:03 crc kubenswrapper[4759]: E1205 01:39:03.255847 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04\": container with ID starting with 
9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04 not found: ID does not exist" containerID="9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.255898 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04"} err="failed to get container status \"9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04\": rpc error: code = NotFound desc = could not find container \"9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04\": container with ID starting with 9574c1daf6030a8bb4c6e400ecbdf70d1ff8703f5d589136c1377fd1d88dee04 not found: ID does not exist" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.255930 4759 scope.go:117] "RemoveContainer" containerID="39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187" Dec 05 01:39:03 crc kubenswrapper[4759]: E1205 01:39:03.256326 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187\": container with ID starting with 39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187 not found: ID does not exist" containerID="39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.256358 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187"} err="failed to get container status \"39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187\": rpc error: code = NotFound desc = could not find container \"39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187\": container with ID starting with 39185252931bc0283fa143dc7aa87562c9bfb4df2fd99397d18eb579ce012187 not found: ID does not exist" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.256372 4759 scope.go:117] "RemoveContainer" containerID="3d3ab91f943a1760053c56f3dd8988706f185754225999c0e568407ea7208da9" Dec 05 01:39:03 crc kubenswrapper[4759]: E1205 01:39:03.256663 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3ab91f943a1760053c56f3dd8988706f185754225999c0e568407ea7208da9\": container with ID starting with 3d3ab91f943a1760053c56f3dd8988706f185754225999c0e568407ea7208da9 not found: ID does not exist" containerID="3d3ab91f943a1760053c56f3dd8988706f185754225999c0e568407ea7208da9" Dec 05 01:39:03 crc kubenswrapper[4759]: I1205 01:39:03.256713 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3ab91f943a1760053c56f3dd8988706f185754225999c0e568407ea7208da9"} err="failed to get container status \"3d3ab91f943a1760053c56f3dd8988706f185754225999c0e568407ea7208da9\": rpc error: code = NotFound desc = could not find container \"3d3ab91f943a1760053c56f3dd8988706f185754225999c0e568407ea7208da9\": container with ID starting with 3d3ab91f943a1760053c56f3dd8988706f185754225999c0e568407ea7208da9 not found: ID does not exist" Dec 05 01:39:05 crc kubenswrapper[4759]: I1205 01:39:05.174171 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d117f3c7-6d75-485c-9fcf-7072abc70551" path="/var/lib/kubelet/pods/d117f3c7-6d75-485c-9fcf-7072abc70551/volumes" Dec 05 01:39:16 crc kubenswrapper[4759]: I1205 01:39:16.156354 
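[editor's note] The three E1205 log.go:32 / "DeleteContainer returned error" pairs above are benign: by the time the kubelet asks the runtime for the status of each container it is deleting, CRI-O has already removed it and answers with gRPC NotFound. Cleanup treats "already gone" as success, which is why the sequence still ends with the orphaned volumes dir being cleaned up at 01:39:05. A minimal sketch of classifying that error code, assuming the google.golang.org/grpc module is on the module path (the helper is illustrative, not the kubelet's code):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // isNotFound reports whether an error from a CRI call carries gRPC
    // code NotFound, i.e. the container is already gone.
    func isNotFound(err error) bool {
        return status.Code(err) == codes.NotFound
    }

    func main() {
        // Fabricated sample error shaped like the one in the log above.
        err := status.Error(codes.NotFound, `could not find container "9574c1..."`)
        if isNotFound(err) {
            fmt.Println("container already removed; treat deletion as done")
        }
    }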
Dec 05 01:39:16 crc kubenswrapper[4759]: I1205 01:39:16.156354 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc"
Dec 05 01:39:17 crc kubenswrapper[4759]: I1205 01:39:17.338634 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"6ce8dcfeed1a3217aa548a4e4b2248720f8aefade2ef5bbe94669f831255aef7"}
Dec 05 01:41:34 crc kubenswrapper[4759]: I1205 01:41:34.433138 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 01:41:34 crc kubenswrapper[4759]: I1205 01:41:34.433966 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 01:41:36 crc kubenswrapper[4759]: I1205 01:41:36.164253 4759 generic.go:334] "Generic (PLEG): container finished" podID="ef0b2002-5521-4629-8083-fd25b382c0db" containerID="2f9378d7f2641623a1651f041dad44815a3b2f3c6d19b083d6ce9b6cf893a0af" exitCode=0
Dec 05 01:41:36 crc kubenswrapper[4759]: I1205 01:41:36.164382 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" event={"ID":"ef0b2002-5521-4629-8083-fd25b382c0db","Type":"ContainerDied","Data":"2f9378d7f2641623a1651f041dad44815a3b2f3c6d19b083d6ce9b6cf893a0af"}
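[editor's note] The patch_prober/prober pair above records a failed HTTP liveness probe: the kubelet issued GET http://127.0.0.1:8798/health for machine-config-daemon and the connection was refused, consistent with the container sitting in CrashLoopBackOff earlier in the log. In effect an HTTPGet probe is a timed GET where any transport error or error status fails the check. A minimal sketch, with only the URL taken from the log and everything else assumed:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probe performs one HTTP liveness check: any transport error
    // (e.g. "connect: connection refused", as in the log above) or an
    // HTTP error status counts as a failure.
    func probe(url string) error {
        client := &http.Client{Timeout: 1 * time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return err
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 400 {
            return fmt.Errorf("unhealthy: HTTP %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probe("http://127.0.0.1:8798/health"); err != nil {
            fmt.Println("Probe failed:", err)
        }
    }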
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.938172 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-2\") pod \"ef0b2002-5521-4629-8083-fd25b382c0db\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.938289 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-1\") pod \"ef0b2002-5521-4629-8083-fd25b382c0db\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.938388 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-telemetry-power-monitoring-combined-ca-bundle\") pod \"ef0b2002-5521-4629-8083-fd25b382c0db\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.938585 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-inventory\") pod \"ef0b2002-5521-4629-8083-fd25b382c0db\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.938655 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ssh-key\") pod \"ef0b2002-5521-4629-8083-fd25b382c0db\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.938676 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceph\") pod \"ef0b2002-5521-4629-8083-fd25b382c0db\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.938706 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-0\") pod \"ef0b2002-5521-4629-8083-fd25b382c0db\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.938733 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8czxl\" (UniqueName: \"kubernetes.io/projected/ef0b2002-5521-4629-8083-fd25b382c0db-kube-api-access-8czxl\") pod \"ef0b2002-5521-4629-8083-fd25b382c0db\" (UID: \"ef0b2002-5521-4629-8083-fd25b382c0db\") " Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.946408 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "ef0b2002-5521-4629-8083-fd25b382c0db" (UID: "ef0b2002-5521-4629-8083-fd25b382c0db"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.947080 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceph" (OuterVolumeSpecName: "ceph") pod "ef0b2002-5521-4629-8083-fd25b382c0db" (UID: "ef0b2002-5521-4629-8083-fd25b382c0db"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.949215 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef0b2002-5521-4629-8083-fd25b382c0db-kube-api-access-8czxl" (OuterVolumeSpecName: "kube-api-access-8czxl") pod "ef0b2002-5521-4629-8083-fd25b382c0db" (UID: "ef0b2002-5521-4629-8083-fd25b382c0db"). InnerVolumeSpecName "kube-api-access-8czxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.979559 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "ef0b2002-5521-4629-8083-fd25b382c0db" (UID: "ef0b2002-5521-4629-8083-fd25b382c0db"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.981906 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ef0b2002-5521-4629-8083-fd25b382c0db" (UID: "ef0b2002-5521-4629-8083-fd25b382c0db"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.985154 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "ef0b2002-5521-4629-8083-fd25b382c0db" (UID: "ef0b2002-5521-4629-8083-fd25b382c0db"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:37 crc kubenswrapper[4759]: I1205 01:41:37.986215 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "ef0b2002-5521-4629-8083-fd25b382c0db" (UID: "ef0b2002-5521-4629-8083-fd25b382c0db"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.005293 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-inventory" (OuterVolumeSpecName: "inventory") pod "ef0b2002-5521-4629-8083-fd25b382c0db" (UID: "ef0b2002-5521-4629-8083-fd25b382c0db"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.041249 4759 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.041300 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.041383 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.041396 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.041409 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.041422 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8czxl\" (UniqueName: \"kubernetes.io/projected/ef0b2002-5521-4629-8083-fd25b382c0db-kube-api-access-8czxl\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.041436 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.041449 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/ef0b2002-5521-4629-8083-fd25b382c0db-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.221144 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" event={"ID":"ef0b2002-5521-4629-8083-fd25b382c0db","Type":"ContainerDied","Data":"6aba6de519ee0bd75a1af2d4727fd5efe26796331fe3c8830148caa5559daf77"} Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.221215 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aba6de519ee0bd75a1af2d4727fd5efe26796331fe3c8830148caa5559daf77" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.221335 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.309595 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z"] Dec 05 01:41:38 crc kubenswrapper[4759]: E1205 01:41:38.312238 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" containerName="extract-utilities" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.312261 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" containerName="extract-utilities" Dec 05 01:41:38 crc kubenswrapper[4759]: E1205 01:41:38.312280 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef0b2002-5521-4629-8083-fd25b382c0db" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.312289 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0b2002-5521-4629-8083-fd25b382c0db" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 05 01:41:38 crc kubenswrapper[4759]: E1205 01:41:38.312310 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" containerName="registry-server" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.312330 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" containerName="registry-server" Dec 05 01:41:38 crc kubenswrapper[4759]: E1205 01:41:38.312338 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d117f3c7-6d75-485c-9fcf-7072abc70551" containerName="registry-server" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.312345 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d117f3c7-6d75-485c-9fcf-7072abc70551" containerName="registry-server" Dec 05 01:41:38 crc kubenswrapper[4759]: E1205 01:41:38.312359 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d117f3c7-6d75-485c-9fcf-7072abc70551" containerName="extract-utilities" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.312365 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d117f3c7-6d75-485c-9fcf-7072abc70551" containerName="extract-utilities" Dec 05 01:41:38 crc kubenswrapper[4759]: E1205 01:41:38.312387 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d117f3c7-6d75-485c-9fcf-7072abc70551" containerName="extract-content" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.312393 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d117f3c7-6d75-485c-9fcf-7072abc70551" containerName="extract-content" Dec 05 01:41:38 crc kubenswrapper[4759]: E1205 01:41:38.312408 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" containerName="extract-content" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.312413 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" containerName="extract-content" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.312614 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bdc985b-28cd-40f6-b6c0-03b2fda15b85" containerName="registry-server" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.312629 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef0b2002-5521-4629-8083-fd25b382c0db" 
containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.312653 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d117f3c7-6d75-485c-9fcf-7072abc70551" containerName="registry-server" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.313381 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.317119 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.317406 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.319652 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.319675 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.319976 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.320792 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dz2xw" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.332586 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z"] Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.451152 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.451223 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.451255 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz4qs\" (UniqueName: \"kubernetes.io/projected/3e40e39a-7038-4839-9104-6cf64842c4a7-kube-api-access-wz4qs\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.451320 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.451404 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.451425 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.553725 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.553817 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.553852 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz4qs\" (UniqueName: \"kubernetes.io/projected/3e40e39a-7038-4839-9104-6cf64842c4a7-kube-api-access-wz4qs\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.553925 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.554004 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.554022 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc 
kubenswrapper[4759]: I1205 01:41:38.558969 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.558989 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.559585 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.559629 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.567478 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.573021 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz4qs\" (UniqueName: \"kubernetes.io/projected/3e40e39a-7038-4839-9104-6cf64842c4a7-kube-api-access-wz4qs\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k248z\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:38 crc kubenswrapper[4759]: I1205 01:41:38.638896 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:39 crc kubenswrapper[4759]: I1205 01:41:39.255190 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z"] Dec 05 01:41:39 crc kubenswrapper[4759]: I1205 01:41:39.260090 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 01:41:40 crc kubenswrapper[4759]: I1205 01:41:40.247885 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" event={"ID":"3e40e39a-7038-4839-9104-6cf64842c4a7","Type":"ContainerStarted","Data":"ac77e105a0324384a7323cb2bc28684f6494d2d93e84f774a326cd3e78894c13"} Dec 05 01:41:40 crc kubenswrapper[4759]: I1205 01:41:40.248770 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" event={"ID":"3e40e39a-7038-4839-9104-6cf64842c4a7","Type":"ContainerStarted","Data":"77ec92a9cee26de4bb22ced6a140aafd16dd3f51ccd80fcdb83678381efb58c3"} Dec 05 01:41:40 crc kubenswrapper[4759]: I1205 01:41:40.287956 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" podStartSLOduration=1.801834547 podStartE2EDuration="2.287925674s" podCreationTimestamp="2025-12-05 01:41:38 +0000 UTC" firstStartedPulling="2025-12-05 01:41:39.259725218 +0000 UTC m=+4718.475386198" lastFinishedPulling="2025-12-05 01:41:39.745816335 +0000 UTC m=+4718.961477325" observedRunningTime="2025-12-05 01:41:40.273848954 +0000 UTC m=+4719.489509944" watchObservedRunningTime="2025-12-05 01:41:40.287925674 +0000 UTC m=+4719.503586664" Dec 05 01:41:54 crc kubenswrapper[4759]: I1205 01:41:54.464916 4759 generic.go:334] "Generic (PLEG): container finished" podID="3e40e39a-7038-4839-9104-6cf64842c4a7" containerID="ac77e105a0324384a7323cb2bc28684f6494d2d93e84f774a326cd3e78894c13" exitCode=0 Dec 05 01:41:54 crc kubenswrapper[4759]: I1205 01:41:54.465029 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" event={"ID":"3e40e39a-7038-4839-9104-6cf64842c4a7","Type":"ContainerDied","Data":"ac77e105a0324384a7323cb2bc28684f6494d2d93e84f774a326cd3e78894c13"} Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.017555 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.203261 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-inventory\") pod \"3e40e39a-7038-4839-9104-6cf64842c4a7\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.203558 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ceph\") pod \"3e40e39a-7038-4839-9104-6cf64842c4a7\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.203616 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-0\") pod \"3e40e39a-7038-4839-9104-6cf64842c4a7\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.203647 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ssh-key\") pod \"3e40e39a-7038-4839-9104-6cf64842c4a7\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.203791 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-1\") pod \"3e40e39a-7038-4839-9104-6cf64842c4a7\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.203831 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz4qs\" (UniqueName: \"kubernetes.io/projected/3e40e39a-7038-4839-9104-6cf64842c4a7-kube-api-access-wz4qs\") pod \"3e40e39a-7038-4839-9104-6cf64842c4a7\" (UID: \"3e40e39a-7038-4839-9104-6cf64842c4a7\") " Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.210011 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e40e39a-7038-4839-9104-6cf64842c4a7-kube-api-access-wz4qs" (OuterVolumeSpecName: "kube-api-access-wz4qs") pod "3e40e39a-7038-4839-9104-6cf64842c4a7" (UID: "3e40e39a-7038-4839-9104-6cf64842c4a7"). InnerVolumeSpecName "kube-api-access-wz4qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.210206 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ceph" (OuterVolumeSpecName: "ceph") pod "3e40e39a-7038-4839-9104-6cf64842c4a7" (UID: "3e40e39a-7038-4839-9104-6cf64842c4a7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.242061 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "3e40e39a-7038-4839-9104-6cf64842c4a7" (UID: "3e40e39a-7038-4839-9104-6cf64842c4a7"). InnerVolumeSpecName "logging-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.260910 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "3e40e39a-7038-4839-9104-6cf64842c4a7" (UID: "3e40e39a-7038-4839-9104-6cf64842c4a7"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.264110 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-inventory" (OuterVolumeSpecName: "inventory") pod "3e40e39a-7038-4839-9104-6cf64842c4a7" (UID: "3e40e39a-7038-4839-9104-6cf64842c4a7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.272603 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3e40e39a-7038-4839-9104-6cf64842c4a7" (UID: "3e40e39a-7038-4839-9104-6cf64842c4a7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.310809 4759 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.310881 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz4qs\" (UniqueName: \"kubernetes.io/projected/3e40e39a-7038-4839-9104-6cf64842c4a7-kube-api-access-wz4qs\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.310912 4759 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.310941 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.310966 4759 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.310993 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e40e39a-7038-4839-9104-6cf64842c4a7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.496162 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" event={"ID":"3e40e39a-7038-4839-9104-6cf64842c4a7","Type":"ContainerDied","Data":"77ec92a9cee26de4bb22ced6a140aafd16dd3f51ccd80fcdb83678381efb58c3"} Dec 05 01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.496208 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77ec92a9cee26de4bb22ced6a140aafd16dd3f51ccd80fcdb83678381efb58c3" Dec 05 
01:41:56 crc kubenswrapper[4759]: I1205 01:41:56.496254 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k248z" Dec 05 01:42:04 crc kubenswrapper[4759]: I1205 01:42:04.433866 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:42:04 crc kubenswrapper[4759]: I1205 01:42:04.434526 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.427371 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 05 01:42:12 crc kubenswrapper[4759]: E1205 01:42:12.428509 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e40e39a-7038-4839-9104-6cf64842c4a7" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.428529 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e40e39a-7038-4839-9104-6cf64842c4a7" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.428801 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e40e39a-7038-4839-9104-6cf64842c4a7" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.430517 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.432880 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.433644 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.448856 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.451108 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.453013 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.459848 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.481137 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521449 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-sys\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521490 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521520 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521555 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521593 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521609 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521635 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521661 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-run\") pod 
\"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521694 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnjnd\" (UniqueName: \"kubernetes.io/projected/dbf1c346-6958-4849-8773-9d7b42b2c6fd-kube-api-access-dnjnd\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521779 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-dev\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521811 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.521933 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.522040 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dbf1c346-6958-4849-8773-9d7b42b2c6fd-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.522123 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.522149 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.522165 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.623962 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-var-lib-cinder\") pod 
\"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624238 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624257 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-scripts\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624280 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624299 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624346 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624366 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-run\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624388 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-lib-modules\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624403 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624424 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-run\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624397 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624427 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624474 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624627 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624693 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624729 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnjnd\" (UniqueName: \"kubernetes.io/projected/dbf1c346-6958-4849-8773-9d7b42b2c6fd-kube-api-access-dnjnd\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624822 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-dev\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624844 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624850 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624912 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3772ce5b-f22d-4f9a-ad46-66923fae82be-ceph\") pod \"cinder-backup-0\" (UID: 
\"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.624963 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-sys\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625011 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8m82\" (UniqueName: \"kubernetes.io/projected/3772ce5b-f22d-4f9a-ad46-66923fae82be-kube-api-access-p8m82\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625044 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625088 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-dev\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625268 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625321 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-dev\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625399 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-config-data\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625440 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625466 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dbf1c346-6958-4849-8773-9d7b42b2c6fd-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625496 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625563 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625597 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-run\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625623 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625643 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625666 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625693 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-sys\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625709 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625742 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625815 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " 
pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625812 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.625976 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-sys\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.626082 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dbf1c346-6958-4849-8773-9d7b42b2c6fd-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.630574 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.630613 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dbf1c346-6958-4849-8773-9d7b42b2c6fd-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.631187 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.631951 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.635060 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf1c346-6958-4849-8773-9d7b42b2c6fd-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.644560 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnjnd\" (UniqueName: \"kubernetes.io/projected/dbf1c346-6958-4849-8773-9d7b42b2c6fd-kube-api-access-dnjnd\") pod \"cinder-volume-volume1-0\" (UID: \"dbf1c346-6958-4849-8773-9d7b42b2c6fd\") " pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.727702 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.727765 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-run\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.727790 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.727826 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.727852 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-scripts\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.727892 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-lib-modules\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.727909 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.727927 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.727946 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.727964 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728001 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/3772ce5b-f22d-4f9a-ad46-66923fae82be-ceph\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728021 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-sys\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728043 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8m82\" (UniqueName: \"kubernetes.io/projected/3772ce5b-f22d-4f9a-ad46-66923fae82be-kube-api-access-p8m82\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728065 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-dev\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728093 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-config-data\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728112 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728637 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-lib-modules\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728704 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728745 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-run\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728782 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728799 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728818 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-sys\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728808 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728821 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728848 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.728867 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3772ce5b-f22d-4f9a-ad46-66923fae82be-dev\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.731054 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.731971 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.733080 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3772ce5b-f22d-4f9a-ad46-66923fae82be-ceph\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.733127 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-config-data\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.733249 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3772ce5b-f22d-4f9a-ad46-66923fae82be-scripts\") pod 
\"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.743712 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8m82\" (UniqueName: \"kubernetes.io/projected/3772ce5b-f22d-4f9a-ad46-66923fae82be-kube-api-access-p8m82\") pod \"cinder-backup-0\" (UID: \"3772ce5b-f22d-4f9a-ad46-66923fae82be\") " pod="openstack/cinder-backup-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.751834 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:12 crc kubenswrapper[4759]: I1205 01:42:12.768107 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.112050 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-4rlm4"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.113695 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-4rlm4" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.127082 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-4rlm4"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.232396 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-fa0a-account-create-update-x559s"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.233704 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-fa0a-account-create-update-x559s" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.237114 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.241795 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck8mq\" (UniqueName: \"kubernetes.io/projected/996d6f16-b82f-4780-8d29-e26f633bd570-kube-api-access-ck8mq\") pod \"manila-db-create-4rlm4\" (UID: \"996d6f16-b82f-4780-8d29-e26f633bd570\") " pod="openstack/manila-db-create-4rlm4" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.241904 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/996d6f16-b82f-4780-8d29-e26f633bd570-operator-scripts\") pod \"manila-db-create-4rlm4\" (UID: \"996d6f16-b82f-4780-8d29-e26f633bd570\") " pod="openstack/manila-db-create-4rlm4" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.259575 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-fa0a-account-create-update-x559s"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.275415 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-66f597fc49-47d87"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.277275 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.283296 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.283532 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.283637 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.283721 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-8cs9x" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.313773 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66f597fc49-47d87"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.343558 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/996d6f16-b82f-4780-8d29-e26f633bd570-operator-scripts\") pod \"manila-db-create-4rlm4\" (UID: \"996d6f16-b82f-4780-8d29-e26f633bd570\") " pod="openstack/manila-db-create-4rlm4" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.343711 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-operator-scripts\") pod \"manila-fa0a-account-create-update-x559s\" (UID: \"c6b8713d-0512-4172-ad78-b7bc92a1d9ba\") " pod="openstack/manila-fa0a-account-create-update-x559s" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.343752 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck8mq\" (UniqueName: \"kubernetes.io/projected/996d6f16-b82f-4780-8d29-e26f633bd570-kube-api-access-ck8mq\") pod \"manila-db-create-4rlm4\" (UID: \"996d6f16-b82f-4780-8d29-e26f633bd570\") " pod="openstack/manila-db-create-4rlm4" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.343787 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmkhl\" (UniqueName: \"kubernetes.io/projected/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-kube-api-access-mmkhl\") pod \"manila-fa0a-account-create-update-x559s\" (UID: \"c6b8713d-0512-4172-ad78-b7bc92a1d9ba\") " pod="openstack/manila-fa0a-account-create-update-x559s" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.344568 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/996d6f16-b82f-4780-8d29-e26f633bd570-operator-scripts\") pod \"manila-db-create-4rlm4\" (UID: \"996d6f16-b82f-4780-8d29-e26f633bd570\") " pod="openstack/manila-db-create-4rlm4" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.369507 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck8mq\" (UniqueName: \"kubernetes.io/projected/996d6f16-b82f-4780-8d29-e26f633bd570-kube-api-access-ck8mq\") pod \"manila-db-create-4rlm4\" (UID: \"996d6f16-b82f-4780-8d29-e26f633bd570\") " pod="openstack/manila-db-create-4rlm4" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.445229 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-horizon-secret-key\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.445365 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-scripts\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.445414 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-operator-scripts\") pod \"manila-fa0a-account-create-update-x559s\" (UID: \"c6b8713d-0512-4172-ad78-b7bc92a1d9ba\") " pod="openstack/manila-fa0a-account-create-update-x559s" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.445441 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-config-data\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.445467 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-logs\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.445512 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmzzm\" (UniqueName: \"kubernetes.io/projected/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-kube-api-access-kmzzm\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.445536 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmkhl\" (UniqueName: \"kubernetes.io/projected/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-kube-api-access-mmkhl\") pod \"manila-fa0a-account-create-update-x559s\" (UID: \"c6b8713d-0512-4172-ad78-b7bc92a1d9ba\") " pod="openstack/manila-fa0a-account-create-update-x559s" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.446011 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-4rlm4" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.446529 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-operator-scripts\") pod \"manila-fa0a-account-create-update-x559s\" (UID: \"c6b8713d-0512-4172-ad78-b7bc92a1d9ba\") " pod="openstack/manila-fa0a-account-create-update-x559s" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.448026 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.449751 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.482868 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7zchw" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.483096 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.483297 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.483436 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.484762 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.491114 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmkhl\" (UniqueName: \"kubernetes.io/projected/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-kube-api-access-mmkhl\") pod \"manila-fa0a-account-create-update-x559s\" (UID: \"c6b8713d-0512-4172-ad78-b7bc92a1d9ba\") " pod="openstack/manila-fa0a-account-create-update-x559s" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.532953 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.534710 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.536952 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.542734 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.544183 4759 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/glance-default-internal-api-0" oldPodUID="59a8ba84-3285-4957-a3da-f719177e4c1a" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.547325 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548566 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-scripts\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548621 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548661 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-config-data\") pod \"horizon-66f597fc49-47d87\" (UID: 
\"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548683 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548709 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-logs\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548744 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmzzm\" (UniqueName: \"kubernetes.io/projected/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-kube-api-access-kmzzm\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548787 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-scripts\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548825 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-horizon-secret-key\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548858 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548880 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk28z\" (UniqueName: \"kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-kube-api-access-vk28z\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548895 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-config-data\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548917 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-ceph\") pod \"glance-default-external-api-0\" 
(UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548943 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-logs\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.548968 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.551541 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-scripts\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.552011 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-logs\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.552103 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-config-data\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.555752 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-horizon-secret-key\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.561217 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8b95d7c69-zrnrb"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.563515 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.573808 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmzzm\" (UniqueName: \"kubernetes.io/projected/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-kube-api-access-kmzzm\") pod \"horizon-66f597fc49-47d87\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.577948 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-fa0a-account-create-update-x559s" Dec 05 01:42:13 crc kubenswrapper[4759]: E1205 01:42:13.587250 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-9tbxr logs scripts], unattached volumes=[], failed to process volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-9tbxr logs scripts]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="59a8ba84-3285-4957-a3da-f719177e4c1a" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.607143 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.626834 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8b95d7c69-zrnrb"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.628891 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.650794 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.650847 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk28z\" (UniqueName: \"kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-kube-api-access-vk28z\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.650866 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-config-data\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.650899 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-ceph\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.650937 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-logs\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.650963 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.651065 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-scripts\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.651083 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-logs\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.651103 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.651128 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-horizon-secret-key\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.651152 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4zsp\" (UniqueName: \"kubernetes.io/projected/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-kube-api-access-r4zsp\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.651172 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.651205 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-config-data\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.651250 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-scripts\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.652089 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.656836 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-logs\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.661105 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.675685 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-config-data\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.679612 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.680262 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-ceph\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.681264 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk28z\" (UniqueName: \"kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-kube-api-access-vk28z\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.681975 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.690626 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.710234 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.710349 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.712491 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"dbf1c346-6958-4849-8773-9d7b42b2c6fd","Type":"ContainerStarted","Data":"ca8980a90e67cf3113cd293e7a8e5a86d2113a098d3af16ee5c49dc73fe67989"} Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.713660 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.713718 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3772ce5b-f22d-4f9a-ad46-66923fae82be","Type":"ContainerStarted","Data":"25c093a7b98a7d9b93aad03a46bac2835dc7bd35858be70650f8278524ad3664"} Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.719438 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.723155 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-scripts\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.731842 4759 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/glance-default-internal-api-0" oldPodUID="59a8ba84-3285-4957-a3da-f719177e4c1a" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753294 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753369 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccss6\" (UniqueName: \"kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-kube-api-access-ccss6\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753409 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753428 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753447 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753467 4759 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753504 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-logs\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753555 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753580 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-logs\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753598 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-scripts\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753626 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-horizon-secret-key\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753649 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4zsp\" (UniqueName: \"kubernetes.io/projected/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-kube-api-access-r4zsp\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753672 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.753701 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-config-data\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.755277 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-config-data\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.756118 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-logs\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.756612 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-scripts\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.759693 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-horizon-secret-key\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.759929 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.768982 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.805154 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4zsp\" (UniqueName: \"kubernetes.io/projected/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-kube-api-access-r4zsp\") pod \"horizon-8b95d7c69-zrnrb\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.824877 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.861785 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.861858 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccss6\" (UniqueName: \"kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-kube-api-access-ccss6\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.861896 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.861915 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.861935 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.861958 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.861995 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-logs\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.862048 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.862095 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: 
I1205 01:42:13.862657 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.864032 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.865278 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-logs\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.868899 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.881987 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.882867 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.885389 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.885835 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.897601 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccss6\" (UniqueName: \"kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-kube-api-access-ccss6\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.898110 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.912442 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") " pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.948516 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.964841 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:13 crc kubenswrapper[4759]: I1205 01:42:13.968492 4759 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/glance-default-internal-api-0" oldPodUID="59a8ba84-3285-4957-a3da-f719177e4c1a" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.160895 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-4rlm4"] Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.262535 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-fa0a-account-create-update-x559s"] Dec 05 01:42:14 crc kubenswrapper[4759]: W1205 01:42:14.308734 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6b8713d_0512_4172_ad78_b7bc92a1d9ba.slice/crio-5ff2d5a9d9f6dbfa5b8912a83f8624cc134cb11690d9fdc3f9e9518a71bc901c WatchSource:0}: Error finding container 5ff2d5a9d9f6dbfa5b8912a83f8624cc134cb11690d9fdc3f9e9518a71bc901c: Status 404 returned error can't find the container with id 5ff2d5a9d9f6dbfa5b8912a83f8624cc134cb11690d9fdc3f9e9518a71bc901c Dec 05 01:42:14 crc kubenswrapper[4759]: W1205 01:42:14.430723 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a2e0ae6_c57c_4c91_a4eb_89619f25a19b.slice/crio-af7c0fb355897ef7f5b62670350811776bf74fec9753245eff972a2df0c58f84 WatchSource:0}: Error finding container af7c0fb355897ef7f5b62670350811776bf74fec9753245eff972a2df0c58f84: Status 404 returned error can't find the container with id af7c0fb355897ef7f5b62670350811776bf74fec9753245eff972a2df0c58f84 Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.432221 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66f597fc49-47d87"] Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.541879 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8b95d7c69-zrnrb"] Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.659193 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.731281 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f597fc49-47d87" event={"ID":"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b","Type":"ContainerStarted","Data":"af7c0fb355897ef7f5b62670350811776bf74fec9753245eff972a2df0c58f84"} Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.733047 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"f6a08fce-525c-4b29-9f89-f1a533a4bc3d","Type":"ContainerStarted","Data":"39dd63290cd6674b56603cd7d3d9764866c991e57550b517b38bbc34a33376c3"} Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.735220 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8b95d7c69-zrnrb" event={"ID":"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f","Type":"ContainerStarted","Data":"6408c9a0b46b1cc6985a3c0bc51e3ff086667ba7c9fb31336494824f79bc5cd0"} Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.736140 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-fa0a-account-create-update-x559s" event={"ID":"c6b8713d-0512-4172-ad78-b7bc92a1d9ba","Type":"ContainerStarted","Data":"5ff2d5a9d9f6dbfa5b8912a83f8624cc134cb11690d9fdc3f9e9518a71bc901c"} Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.737756 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.744997 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-4rlm4" event={"ID":"996d6f16-b82f-4780-8d29-e26f633bd570","Type":"ContainerStarted","Data":"898cbc303a30965158b622e8d7400a1171c14af6e3e937db7aae492df35dcf51"} Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.745031 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-4rlm4" event={"ID":"996d6f16-b82f-4780-8d29-e26f633bd570","Type":"ContainerStarted","Data":"c4074fa3dabf48867308fbf9982207a8d1112f51a1b96bd1a6d303728b5da2a5"} Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.765208 4759 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/glance-default-internal-api-0" oldPodUID="59a8ba84-3285-4957-a3da-f719177e4c1a" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.773836 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-4rlm4" podStartSLOduration=1.7738174199999999 podStartE2EDuration="1.77381742s" podCreationTimestamp="2025-12-05 01:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:42:14.761289958 +0000 UTC m=+4753.976950898" watchObservedRunningTime="2025-12-05 01:42:14.77381742 +0000 UTC m=+4753.989478370" Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.808701 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 01:42:14 crc kubenswrapper[4759]: W1205 01:42:14.817146 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2266859_2cb8_4b8a_a25f_ebb94d002df3.slice/crio-aa44ff11099ad72a8f08a741300e3becd410b0bb8950933832381e522b5d091d WatchSource:0}: Error finding container aa44ff11099ad72a8f08a741300e3becd410b0bb8950933832381e522b5d091d: Status 404 returned error can't find the container with id aa44ff11099ad72a8f08a741300e3becd410b0bb8950933832381e522b5d091d Dec 05 01:42:14 crc kubenswrapper[4759]: I1205 01:42:14.962695 4759 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/glance-default-internal-api-0" oldPodUID="59a8ba84-3285-4957-a3da-f719177e4c1a" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.200197 4759 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="59a8ba84-3285-4957-a3da-f719177e4c1a" path="/var/lib/kubelet/pods/59a8ba84-3285-4957-a3da-f719177e4c1a/volumes" Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.761991 4759 generic.go:334] "Generic (PLEG): container finished" podID="996d6f16-b82f-4780-8d29-e26f633bd570" containerID="898cbc303a30965158b622e8d7400a1171c14af6e3e937db7aae492df35dcf51" exitCode=0 Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.762660 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-4rlm4" event={"ID":"996d6f16-b82f-4780-8d29-e26f633bd570","Type":"ContainerDied","Data":"898cbc303a30965158b622e8d7400a1171c14af6e3e937db7aae492df35dcf51"} Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.766077 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2266859-2cb8-4b8a-a25f-ebb94d002df3","Type":"ContainerStarted","Data":"ab5a9145f71f5c4b0ac99be1ee369a812e0005a96bb3a1fe8988f6a3f0990b8d"} Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.766379 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2266859-2cb8-4b8a-a25f-ebb94d002df3","Type":"ContainerStarted","Data":"aa44ff11099ad72a8f08a741300e3becd410b0bb8950933832381e522b5d091d"} Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.817827 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f6a08fce-525c-4b29-9f89-f1a533a4bc3d","Type":"ContainerStarted","Data":"a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20"} Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.828614 4759 generic.go:334] "Generic (PLEG): container finished" podID="c6b8713d-0512-4172-ad78-b7bc92a1d9ba" containerID="897967f4179143f090e9c302eccda8369c9d83a84f00e32eac29b842085f4d00" exitCode=0 Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.828779 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-fa0a-account-create-update-x559s" event={"ID":"c6b8713d-0512-4172-ad78-b7bc92a1d9ba","Type":"ContainerDied","Data":"897967f4179143f090e9c302eccda8369c9d83a84f00e32eac29b842085f4d00"} Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.843843 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"dbf1c346-6958-4849-8773-9d7b42b2c6fd","Type":"ContainerStarted","Data":"53a21c8d231a1db83218b7edc891e2be34022dfe4b8f4a94bc62b303281f7244"} Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.843895 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"dbf1c346-6958-4849-8773-9d7b42b2c6fd","Type":"ContainerStarted","Data":"f91bb41182e3e77d1a446b959589b7f4395d6ab8bcd975898ca17c741ac4452f"} Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.858781 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3772ce5b-f22d-4f9a-ad46-66923fae82be","Type":"ContainerStarted","Data":"6ec03dcdc1918f61d1c261822c8bdb33a7ad5ef0e53abfa1bd37eb50fb1766e7"} Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.859418 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3772ce5b-f22d-4f9a-ad46-66923fae82be","Type":"ContainerStarted","Data":"63c99c59b23f3e8e595c04d698ce1bce76ef95326f03db2e3771e71f526132cb"} Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.930253 4759 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.901251195 podStartE2EDuration="3.93022837s" podCreationTimestamp="2025-12-05 01:42:12 +0000 UTC" firstStartedPulling="2025-12-05 01:42:13.666129087 +0000 UTC m=+4752.881790037" lastFinishedPulling="2025-12-05 01:42:14.695106262 +0000 UTC m=+4753.910767212" observedRunningTime="2025-12-05 01:42:15.874533847 +0000 UTC m=+4755.090194797" watchObservedRunningTime="2025-12-05 01:42:15.93022837 +0000 UTC m=+4755.145889320" Dec 05 01:42:15 crc kubenswrapper[4759]: I1205 01:42:15.956533 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.124705266 podStartE2EDuration="3.956504694s" podCreationTimestamp="2025-12-05 01:42:12 +0000 UTC" firstStartedPulling="2025-12-05 01:42:13.530016873 +0000 UTC m=+4752.745677823" lastFinishedPulling="2025-12-05 01:42:14.361816301 +0000 UTC m=+4753.577477251" observedRunningTime="2025-12-05 01:42:15.897189953 +0000 UTC m=+4755.112850903" watchObservedRunningTime="2025-12-05 01:42:15.956504694 +0000 UTC m=+4755.172165654" Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.710907 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66f597fc49-47d87"] Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.776039 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-759bc69744-82t58"] Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.784916 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.800057 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.831411 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-759bc69744-82t58"] Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.864573 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.882335 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8b95d7c69-zrnrb"] Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.893915 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9978bcbc-9b12-401b-b73e-7aeb17587928-logs\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.893978 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm4bt\" (UniqueName: \"kubernetes.io/projected/9978bcbc-9b12-401b-b73e-7aeb17587928-kube-api-access-wm4bt\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.894024 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-config-data\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 
01:42:16.894076 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-tls-certs\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.894132 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-secret-key\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.894199 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-combined-ca-bundle\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.894353 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-scripts\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.909836 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f6a08fce-525c-4b29-9f89-f1a533a4bc3d","Type":"ContainerStarted","Data":"27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e"} Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.916702 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6dc499994d-vp27z"] Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.931202 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.938490 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.951121 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dc499994d-vp27z"] Dec 05 01:42:16 crc kubenswrapper[4759]: I1205 01:42:16.977133 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.977098727 podStartE2EDuration="3.977098727s" podCreationTimestamp="2025-12-05 01:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:42:16.943770542 +0000 UTC m=+4756.159431482" watchObservedRunningTime="2025-12-05 01:42:16.977098727 +0000 UTC m=+4756.192759677" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.002550 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-tls-certs\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.002602 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-secret-key\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.002640 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-combined-ca-bundle\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.002723 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-scripts\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.002791 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9978bcbc-9b12-401b-b73e-7aeb17587928-logs\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.002817 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm4bt\" (UniqueName: \"kubernetes.io/projected/9978bcbc-9b12-401b-b73e-7aeb17587928-kube-api-access-wm4bt\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.002843 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-config-data\") pod \"horizon-759bc69744-82t58\" (UID: 
\"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.004043 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-config-data\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.006642 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-scripts\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.007716 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9978bcbc-9b12-401b-b73e-7aeb17587928-logs\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.023269 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-combined-ca-bundle\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.028098 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm4bt\" (UniqueName: \"kubernetes.io/projected/9978bcbc-9b12-401b-b73e-7aeb17587928-kube-api-access-wm4bt\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.029806 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-secret-key\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.032584 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-tls-certs\") pod \"horizon-759bc69744-82t58\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.106830 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brcxn\" (UniqueName: \"kubernetes.io/projected/81556b05-cd4e-407a-830f-e7e38962d519-kube-api-access-brcxn\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.107430 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81556b05-cd4e-407a-830f-e7e38962d519-scripts\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.107476 4759 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81556b05-cd4e-407a-830f-e7e38962d519-combined-ca-bundle\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.107534 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81556b05-cd4e-407a-830f-e7e38962d519-config-data\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.107597 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81556b05-cd4e-407a-830f-e7e38962d519-logs\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.107676 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/81556b05-cd4e-407a-830f-e7e38962d519-horizon-secret-key\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.107767 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/81556b05-cd4e-407a-830f-e7e38962d519-horizon-tls-certs\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.137915 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.212460 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81556b05-cd4e-407a-830f-e7e38962d519-scripts\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.212525 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81556b05-cd4e-407a-830f-e7e38962d519-combined-ca-bundle\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.212572 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81556b05-cd4e-407a-830f-e7e38962d519-config-data\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.212615 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81556b05-cd4e-407a-830f-e7e38962d519-logs\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.212672 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/81556b05-cd4e-407a-830f-e7e38962d519-horizon-secret-key\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.212713 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/81556b05-cd4e-407a-830f-e7e38962d519-horizon-tls-certs\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.212731 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brcxn\" (UniqueName: \"kubernetes.io/projected/81556b05-cd4e-407a-830f-e7e38962d519-kube-api-access-brcxn\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.213405 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81556b05-cd4e-407a-830f-e7e38962d519-scripts\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.214183 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81556b05-cd4e-407a-830f-e7e38962d519-config-data\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.214645 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81556b05-cd4e-407a-830f-e7e38962d519-logs\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.222859 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81556b05-cd4e-407a-830f-e7e38962d519-combined-ca-bundle\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.233693 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/81556b05-cd4e-407a-830f-e7e38962d519-horizon-secret-key\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.250614 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brcxn\" (UniqueName: \"kubernetes.io/projected/81556b05-cd4e-407a-830f-e7e38962d519-kube-api-access-brcxn\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.260139 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/81556b05-cd4e-407a-830f-e7e38962d519-horizon-tls-certs\") pod \"horizon-6dc499994d-vp27z\" (UID: \"81556b05-cd4e-407a-830f-e7e38962d519\") " pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.260755 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.729573 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-fa0a-account-create-update-x559s" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.734055 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-4rlm4" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.758389 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.770992 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.871059 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-operator-scripts\") pod \"c6b8713d-0512-4172-ad78-b7bc92a1d9ba\" (UID: \"c6b8713d-0512-4172-ad78-b7bc92a1d9ba\") " Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.871682 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/996d6f16-b82f-4780-8d29-e26f633bd570-operator-scripts\") pod \"996d6f16-b82f-4780-8d29-e26f633bd570\" (UID: \"996d6f16-b82f-4780-8d29-e26f633bd570\") " Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.871773 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmkhl\" (UniqueName: \"kubernetes.io/projected/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-kube-api-access-mmkhl\") pod \"c6b8713d-0512-4172-ad78-b7bc92a1d9ba\" (UID: \"c6b8713d-0512-4172-ad78-b7bc92a1d9ba\") " Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.871858 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck8mq\" (UniqueName: \"kubernetes.io/projected/996d6f16-b82f-4780-8d29-e26f633bd570-kube-api-access-ck8mq\") pod \"996d6f16-b82f-4780-8d29-e26f633bd570\" (UID: \"996d6f16-b82f-4780-8d29-e26f633bd570\") " Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.872028 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6b8713d-0512-4172-ad78-b7bc92a1d9ba" (UID: "c6b8713d-0512-4172-ad78-b7bc92a1d9ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.872689 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.873365 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/996d6f16-b82f-4780-8d29-e26f633bd570-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "996d6f16-b82f-4780-8d29-e26f633bd570" (UID: "996d6f16-b82f-4780-8d29-e26f633bd570"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.878909 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/996d6f16-b82f-4780-8d29-e26f633bd570-kube-api-access-ck8mq" (OuterVolumeSpecName: "kube-api-access-ck8mq") pod "996d6f16-b82f-4780-8d29-e26f633bd570" (UID: "996d6f16-b82f-4780-8d29-e26f633bd570"). InnerVolumeSpecName "kube-api-access-ck8mq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.935719 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-kube-api-access-mmkhl" (OuterVolumeSpecName: "kube-api-access-mmkhl") pod "c6b8713d-0512-4172-ad78-b7bc92a1d9ba" (UID: "c6b8713d-0512-4172-ad78-b7bc92a1d9ba"). InnerVolumeSpecName "kube-api-access-mmkhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.952960 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-fa0a-account-create-update-x559s" event={"ID":"c6b8713d-0512-4172-ad78-b7bc92a1d9ba","Type":"ContainerDied","Data":"5ff2d5a9d9f6dbfa5b8912a83f8624cc134cb11690d9fdc3f9e9518a71bc901c"} Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.953014 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ff2d5a9d9f6dbfa5b8912a83f8624cc134cb11690d9fdc3f9e9518a71bc901c" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.953094 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-fa0a-account-create-update-x559s" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.998861 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-4rlm4" event={"ID":"996d6f16-b82f-4780-8d29-e26f633bd570","Type":"ContainerDied","Data":"c4074fa3dabf48867308fbf9982207a8d1112f51a1b96bd1a6d303728b5da2a5"} Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.999227 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4074fa3dabf48867308fbf9982207a8d1112f51a1b96bd1a6d303728b5da2a5" Dec 05 01:42:17 crc kubenswrapper[4759]: I1205 01:42:17.999347 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-4rlm4" Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.009591 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f6a08fce-525c-4b29-9f89-f1a533a4bc3d" containerName="glance-log" containerID="cri-o://a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20" gracePeriod=30 Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.010004 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" containerName="glance-log" containerID="cri-o://ab5a9145f71f5c4b0ac99be1ee369a812e0005a96bb3a1fe8988f6a3f0990b8d" gracePeriod=30 Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.010069 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2266859-2cb8-4b8a-a25f-ebb94d002df3","Type":"ContainerStarted","Data":"bfe14cc60adfadad1c44a1d8100446f9aed511d0d82e0c99b9c1b156c23a52e9"} Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.012153 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f6a08fce-525c-4b29-9f89-f1a533a4bc3d" containerName="glance-httpd" containerID="cri-o://27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e" gracePeriod=30 Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.014459 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" containerName="glance-httpd" containerID="cri-o://bfe14cc60adfadad1c44a1d8100446f9aed511d0d82e0c99b9c1b156c23a52e9" gracePeriod=30 Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.050224 4759 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/996d6f16-b82f-4780-8d29-e26f633bd570-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.050266 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmkhl\" (UniqueName: \"kubernetes.io/projected/c6b8713d-0512-4172-ad78-b7bc92a1d9ba-kube-api-access-mmkhl\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.050279 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck8mq\" (UniqueName: \"kubernetes.io/projected/996d6f16-b82f-4780-8d29-e26f633bd570-kube-api-access-ck8mq\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.054976 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-759bc69744-82t58"] Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.070969 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.070945986 podStartE2EDuration="5.070945986s" podCreationTimestamp="2025-12-05 01:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:42:18.04124 +0000 UTC m=+4757.256900950" watchObservedRunningTime="2025-12-05 01:42:18.070945986 +0000 UTC m=+4757.286606936" Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.259076 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dc499994d-vp27z"] Dec 05 
01:42:18 crc kubenswrapper[4759]: W1205 01:42:18.311295 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81556b05_cd4e_407a_830f_e7e38962d519.slice/crio-98b0d4e2d28b9033e9295e1358e3b43123724756b09822f37ab9da4dc2abe91c WatchSource:0}: Error finding container 98b0d4e2d28b9033e9295e1358e3b43123724756b09822f37ab9da4dc2abe91c: Status 404 returned error can't find the container with id 98b0d4e2d28b9033e9295e1358e3b43123724756b09822f37ab9da4dc2abe91c Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.831002 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.971527 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-ceph\") pod \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.971628 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-public-tls-certs\") pod \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.971701 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-httpd-run\") pod \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.971727 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-scripts\") pod \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.971783 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-combined-ca-bundle\") pod \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.971859 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.971878 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-logs\") pod \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.971906 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk28z\" (UniqueName: \"kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-kube-api-access-vk28z\") pod \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") " Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.971974 4759 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-config-data\") pod \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\" (UID: \"f6a08fce-525c-4b29-9f89-f1a533a4bc3d\") "
Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.973010 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f6a08fce-525c-4b29-9f89-f1a533a4bc3d" (UID: "f6a08fce-525c-4b29-9f89-f1a533a4bc3d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.977574 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-logs" (OuterVolumeSpecName: "logs") pod "f6a08fce-525c-4b29-9f89-f1a533a4bc3d" (UID: "f6a08fce-525c-4b29-9f89-f1a533a4bc3d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.980691 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-ceph" (OuterVolumeSpecName: "ceph") pod "f6a08fce-525c-4b29-9f89-f1a533a4bc3d" (UID: "f6a08fce-525c-4b29-9f89-f1a533a4bc3d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.983427 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-kube-api-access-vk28z" (OuterVolumeSpecName: "kube-api-access-vk28z") pod "f6a08fce-525c-4b29-9f89-f1a533a4bc3d" (UID: "f6a08fce-525c-4b29-9f89-f1a533a4bc3d"). InnerVolumeSpecName "kube-api-access-vk28z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.983587 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "f6a08fce-525c-4b29-9f89-f1a533a4bc3d" (UID: "f6a08fce-525c-4b29-9f89-f1a533a4bc3d"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 05 01:42:18 crc kubenswrapper[4759]: I1205 01:42:18.984645 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-scripts" (OuterVolumeSpecName: "scripts") pod "f6a08fce-525c-4b29-9f89-f1a533a4bc3d" (UID: "f6a08fce-525c-4b29-9f89-f1a533a4bc3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.005595 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6a08fce-525c-4b29-9f89-f1a533a4bc3d" (UID: "f6a08fce-525c-4b29-9f89-f1a533a4bc3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.022705 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dc499994d-vp27z" event={"ID":"81556b05-cd4e-407a-830f-e7e38962d519","Type":"ContainerStarted","Data":"98b0d4e2d28b9033e9295e1358e3b43123724756b09822f37ab9da4dc2abe91c"}
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.025583 4759 generic.go:334] "Generic (PLEG): container finished" podID="f6a08fce-525c-4b29-9f89-f1a533a4bc3d" containerID="27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e" exitCode=0
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.025605 4759 generic.go:334] "Generic (PLEG): container finished" podID="f6a08fce-525c-4b29-9f89-f1a533a4bc3d" containerID="a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20" exitCode=143
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.025636 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f6a08fce-525c-4b29-9f89-f1a533a4bc3d","Type":"ContainerDied","Data":"27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e"}
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.025653 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f6a08fce-525c-4b29-9f89-f1a533a4bc3d","Type":"ContainerDied","Data":"a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20"}
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.025807 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.025901 4759 scope.go:117] "RemoveContainer" containerID="27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.026402 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f6a08fce-525c-4b29-9f89-f1a533a4bc3d","Type":"ContainerDied","Data":"39dd63290cd6674b56603cd7d3d9764866c991e57550b517b38bbc34a33376c3"}
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.028056 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-759bc69744-82t58" event={"ID":"9978bcbc-9b12-401b-b73e-7aeb17587928","Type":"ContainerStarted","Data":"d460350139f3d98f3f210b0411cf5080185deab396814346e1f7e110b938185d"}
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.034133 4759 generic.go:334] "Generic (PLEG): container finished" podID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" containerID="bfe14cc60adfadad1c44a1d8100446f9aed511d0d82e0c99b9c1b156c23a52e9" exitCode=0
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.034166 4759 generic.go:334] "Generic (PLEG): container finished" podID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" containerID="ab5a9145f71f5c4b0ac99be1ee369a812e0005a96bb3a1fe8988f6a3f0990b8d" exitCode=143
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.034188 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2266859-2cb8-4b8a-a25f-ebb94d002df3","Type":"ContainerDied","Data":"bfe14cc60adfadad1c44a1d8100446f9aed511d0d82e0c99b9c1b156c23a52e9"}
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.034216 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2266859-2cb8-4b8a-a25f-ebb94d002df3","Type":"ContainerDied","Data":"ab5a9145f71f5c4b0ac99be1ee369a812e0005a96bb3a1fe8988f6a3f0990b8d"}
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.057745 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-config-data" (OuterVolumeSpecName: "config-data") pod "f6a08fce-525c-4b29-9f89-f1a533a4bc3d" (UID: "f6a08fce-525c-4b29-9f89-f1a533a4bc3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.062371 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f6a08fce-525c-4b29-9f89-f1a533a4bc3d" (UID: "f6a08fce-525c-4b29-9f89-f1a533a4bc3d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.073886 4759 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.074081 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.074140 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.074220 4759 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.074294 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-logs\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.074388 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk28z\" (UniqueName: \"kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-kube-api-access-vk28z\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.074443 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.074501 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-ceph\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.074559 4759 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a08fce-525c-4b29-9f89-f1a533a4bc3d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.098563 4759 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.134460 4759 scope.go:117] "RemoveContainer" containerID="a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.164468 4759 scope.go:117] "RemoveContainer" containerID="27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e"
Dec 05 01:42:19 crc kubenswrapper[4759]: E1205 01:42:19.164927 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e\": container with ID starting with 27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e not found: ID does not exist" containerID="27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.164957 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e"} err="failed to get container status \"27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e\": rpc error: code = NotFound desc = could not find container \"27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e\": container with ID starting with 27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e not found: ID does not exist"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.164979 4759 scope.go:117] "RemoveContainer" containerID="a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20"
Dec 05 01:42:19 crc kubenswrapper[4759]: E1205 01:42:19.165870 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20\": container with ID starting with a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20 not found: ID does not exist" containerID="a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.165894 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20"} err="failed to get container status \"a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20\": rpc error: code = NotFound desc = could not find container \"a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20\": container with ID starting with a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20 not found: ID does not exist"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.165907 4759 scope.go:117] "RemoveContainer" containerID="27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.166343 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e"} err="failed to get container status \"27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e\": rpc error: code = NotFound desc = could not find container \"27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e\": container with ID starting with 27359f87f443b253f17db654a545bf6cdfc29c1ce2ab5cdead01b412a8a9a88e not found: ID does not exist"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.166362 4759 scope.go:117] "RemoveContainer" containerID="a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.167623 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20"} err="failed to get container status \"a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20\": rpc error: code = NotFound desc = could not find container \"a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20\": container with ID starting with a43a17c777d5fbedc863174475ac68000d7871bbdfd39e87c078a8c642547a20 not found: ID does not exist"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.176370 4759 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.359703 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.369331 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.406161 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 01:42:19 crc kubenswrapper[4759]: E1205 01:42:19.406662 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b8713d-0512-4172-ad78-b7bc92a1d9ba" containerName="mariadb-account-create-update"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.406676 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b8713d-0512-4172-ad78-b7bc92a1d9ba" containerName="mariadb-account-create-update"
Dec 05 01:42:19 crc kubenswrapper[4759]: E1205 01:42:19.406686 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a08fce-525c-4b29-9f89-f1a533a4bc3d" containerName="glance-httpd"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.406693 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a08fce-525c-4b29-9f89-f1a533a4bc3d" containerName="glance-httpd"
Dec 05 01:42:19 crc kubenswrapper[4759]: E1205 01:42:19.406713 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996d6f16-b82f-4780-8d29-e26f633bd570" containerName="mariadb-database-create"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.406720 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="996d6f16-b82f-4780-8d29-e26f633bd570" containerName="mariadb-database-create"
Dec 05 01:42:19 crc kubenswrapper[4759]: E1205 01:42:19.406735 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a08fce-525c-4b29-9f89-f1a533a4bc3d" containerName="glance-log"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.406741 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a08fce-525c-4b29-9f89-f1a533a4bc3d" containerName="glance-log"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.406955 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a08fce-525c-4b29-9f89-f1a533a4bc3d" containerName="glance-log"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.406967 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="996d6f16-b82f-4780-8d29-e26f633bd570" containerName="mariadb-database-create"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.406996 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a08fce-525c-4b29-9f89-f1a533a4bc3d" containerName="glance-httpd"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.407009 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b8713d-0512-4172-ad78-b7bc92a1d9ba" containerName="mariadb-account-create-update"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.408238 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.413402 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.424837 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.428381 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.498775 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-scripts\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.498860 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cppq6\" (UniqueName: \"kubernetes.io/projected/e29f7592-d4e5-4a46-bcdd-b52666d8e689-kube-api-access-cppq6\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.498936 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.498962 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e29f7592-d4e5-4a46-bcdd-b52666d8e689-ceph\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.498998 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.499018 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e29f7592-d4e5-4a46-bcdd-b52666d8e689-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.499052 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e29f7592-d4e5-4a46-bcdd-b52666d8e689-logs\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.499084 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-config-data\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.499121 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.600669 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.601081 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e29f7592-d4e5-4a46-bcdd-b52666d8e689-ceph\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.601208 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.601245 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e29f7592-d4e5-4a46-bcdd-b52666d8e689-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.601350 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e29f7592-d4e5-4a46-bcdd-b52666d8e689-logs\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.601388 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-config-data\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.601421 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.604009 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-scripts\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.604179 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cppq6\" (UniqueName: \"kubernetes.io/projected/e29f7592-d4e5-4a46-bcdd-b52666d8e689-kube-api-access-cppq6\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.605418 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.606173 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e29f7592-d4e5-4a46-bcdd-b52666d8e689-logs\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.606988 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e29f7592-d4e5-4a46-bcdd-b52666d8e689-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.608854 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.611256 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-scripts\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.611439 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.611710 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e29f7592-d4e5-4a46-bcdd-b52666d8e689-ceph\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.612618 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29f7592-d4e5-4a46-bcdd-b52666d8e689-config-data\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.623219 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cppq6\" (UniqueName: \"kubernetes.io/projected/e29f7592-d4e5-4a46-bcdd-b52666d8e689-kube-api-access-cppq6\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.653645 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e29f7592-d4e5-4a46-bcdd-b52666d8e689\") " pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.729416 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.748511 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.910256 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-scripts\") pod \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") "
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.910329 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-internal-tls-certs\") pod \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") "
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.910414 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-combined-ca-bundle\") pod \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") "
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.910432 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") "
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.910462 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccss6\" (UniqueName: \"kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-kube-api-access-ccss6\") pod \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") "
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.910536 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-httpd-run\") pod \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") "
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.910606 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-config-data\") pod \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") "
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.910648 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-logs\") pod \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") "
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.910710 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-ceph\") pod \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\" (UID: \"e2266859-2cb8-4b8a-a25f-ebb94d002df3\") "
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.915378 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-logs" (OuterVolumeSpecName: "logs") pod "e2266859-2cb8-4b8a-a25f-ebb94d002df3" (UID: "e2266859-2cb8-4b8a-a25f-ebb94d002df3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:42:19 crc kubenswrapper[4759]: I1205 01:42:19.915893 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e2266859-2cb8-4b8a-a25f-ebb94d002df3" (UID: "e2266859-2cb8-4b8a-a25f-ebb94d002df3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.015690 4759 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.015782 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2266859-2cb8-4b8a-a25f-ebb94d002df3-logs\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.060165 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2266859-2cb8-4b8a-a25f-ebb94d002df3","Type":"ContainerDied","Data":"aa44ff11099ad72a8f08a741300e3becd410b0bb8950933832381e522b5d091d"}
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.060260 4759 scope.go:117] "RemoveContainer" containerID="bfe14cc60adfadad1c44a1d8100446f9aed511d0d82e0c99b9c1b156c23a52e9"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.060622 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.267932 4759 scope.go:117] "RemoveContainer" containerID="ab5a9145f71f5c4b0ac99be1ee369a812e0005a96bb3a1fe8988f6a3f0990b8d"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.371762 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "e2266859-2cb8-4b8a-a25f-ebb94d002df3" (UID: "e2266859-2cb8-4b8a-a25f-ebb94d002df3"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.372238 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-scripts" (OuterVolumeSpecName: "scripts") pod "e2266859-2cb8-4b8a-a25f-ebb94d002df3" (UID: "e2266859-2cb8-4b8a-a25f-ebb94d002df3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.372657 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-ceph" (OuterVolumeSpecName: "ceph") pod "e2266859-2cb8-4b8a-a25f-ebb94d002df3" (UID: "e2266859-2cb8-4b8a-a25f-ebb94d002df3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.384292 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-kube-api-access-ccss6" (OuterVolumeSpecName: "kube-api-access-ccss6") pod "e2266859-2cb8-4b8a-a25f-ebb94d002df3" (UID: "e2266859-2cb8-4b8a-a25f-ebb94d002df3"). InnerVolumeSpecName "kube-api-access-ccss6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.425540 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2266859-2cb8-4b8a-a25f-ebb94d002df3" (UID: "e2266859-2cb8-4b8a-a25f-ebb94d002df3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.427650 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.427682 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.427721 4759 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.427734 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccss6\" (UniqueName: \"kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-kube-api-access-ccss6\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.427746 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2266859-2cb8-4b8a-a25f-ebb94d002df3-ceph\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.474694 4759 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.477435 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-config-data" (OuterVolumeSpecName: "config-data") pod "e2266859-2cb8-4b8a-a25f-ebb94d002df3" (UID: "e2266859-2cb8-4b8a-a25f-ebb94d002df3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.494023 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e2266859-2cb8-4b8a-a25f-ebb94d002df3" (UID: "e2266859-2cb8-4b8a-a25f-ebb94d002df3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.531361 4759 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.531721 4759 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.531732 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2266859-2cb8-4b8a-a25f-ebb94d002df3-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.721021 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.734964 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.766220 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 01:42:20 crc kubenswrapper[4759]: E1205 01:42:20.767532 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" containerName="glance-log"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.767559 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" containerName="glance-log"
Dec 05 01:42:20 crc kubenswrapper[4759]: E1205 01:42:20.767603 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" containerName="glance-httpd"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.767610 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" containerName="glance-httpd"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.768077 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" containerName="glance-log"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.768127 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" containerName="glance-httpd"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.770866 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.779456 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.779753 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.818784 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.855707 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.855804 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44ace8b-2e47-4682-bcea-3626f840d31b-logs\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.855833 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c44ace8b-2e47-4682-bcea-3626f840d31b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.855916 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.855949 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.856041 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.856078 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.856118 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhfhw\" (UniqueName: \"kubernetes.io/projected/c44ace8b-2e47-4682-bcea-3626f840d31b-kube-api-access-qhfhw\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.856144 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c44ace8b-2e47-4682-bcea-3626f840d31b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.957722 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c44ace8b-2e47-4682-bcea-3626f840d31b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.958110 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.958163 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44ace8b-2e47-4682-bcea-3626f840d31b-logs\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.958188 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c44ace8b-2e47-4682-bcea-3626f840d31b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.958247 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.958267 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.958347 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.958379 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.958407 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfhw\" (UniqueName: \"kubernetes.io/projected/c44ace8b-2e47-4682-bcea-3626f840d31b-kube-api-access-qhfhw\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.958499 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c44ace8b-2e47-4682-bcea-3626f840d31b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.959342 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.959408 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44ace8b-2e47-4682-bcea-3626f840d31b-logs\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.965138 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.965771 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c44ace8b-2e47-4682-bcea-3626f840d31b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.966685 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.970848 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.974131 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44ace8b-2e47-4682-bcea-3626f840d31b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:20 crc kubenswrapper[4759]: I1205 01:42:20.977890 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhfhw\" (UniqueName: \"kubernetes.io/projected/c44ace8b-2e47-4682-bcea-3626f840d31b-kube-api-access-qhfhw\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:21 crc kubenswrapper[4759]: I1205 01:42:21.024249 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c44ace8b-2e47-4682-bcea-3626f840d31b\") " pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:21 crc kubenswrapper[4759]: I1205 01:42:21.070131 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 01:42:21 crc kubenswrapper[4759]: W1205 01:42:21.073449 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode29f7592_d4e5_4a46_bcdd_b52666d8e689.slice/crio-59a4499aa3ae94ac4039310ddefd86a52873d6adde9c41398e07b323a5be986c WatchSource:0}: Error finding container 59a4499aa3ae94ac4039310ddefd86a52873d6adde9c41398e07b323a5be986c: Status 404 returned error can't find the container with id 59a4499aa3ae94ac4039310ddefd86a52873d6adde9c41398e07b323a5be986c
Dec 05 01:42:21 crc kubenswrapper[4759]: I1205 01:42:21.092547 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e29f7592-d4e5-4a46-bcdd-b52666d8e689","Type":"ContainerStarted","Data":"59a4499aa3ae94ac4039310ddefd86a52873d6adde9c41398e07b323a5be986c"}
Dec 05 01:42:21 crc kubenswrapper[4759]: I1205 01:42:21.098010 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 01:42:21 crc kubenswrapper[4759]: I1205 01:42:21.186529 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2266859-2cb8-4b8a-a25f-ebb94d002df3" path="/var/lib/kubelet/pods/e2266859-2cb8-4b8a-a25f-ebb94d002df3/volumes"
Dec 05 01:42:21 crc kubenswrapper[4759]: I1205 01:42:21.188647 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6a08fce-525c-4b29-9f89-f1a533a4bc3d" path="/var/lib/kubelet/pods/f6a08fce-525c-4b29-9f89-f1a533a4bc3d/volumes"
Dec 05 01:42:21 crc kubenswrapper[4759]: I1205 01:42:21.822188 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 05 01:42:22 crc kubenswrapper[4759]: I1205 01:42:22.107542 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e29f7592-d4e5-4a46-bcdd-b52666d8e689","Type":"ContainerStarted","Data":"0dff8edf66356dd401e6c5b0d3d5042a140fab1b39b3003a76ae1e1687579a70"}
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.013350 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.047910 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.754363 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-ppb4v"]
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.757603 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.782769 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.783050 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-44nxb"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.815394 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-ppb4v"]
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.839029 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-job-config-data\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.839232 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-combined-ca-bundle\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.839350 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm6pv\" (UniqueName: \"kubernetes.io/projected/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-kube-api-access-cm6pv\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.839388 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-config-data\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: W1205 01:42:23.858696 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc44ace8b_2e47_4682_bcea_3626f840d31b.slice/crio-2ad1355eb1f48289d159f84dc3556f6cb69ebb79249c8bf21a51e85d7babfba1 WatchSource:0}: Error finding container 2ad1355eb1f48289d159f84dc3556f6cb69ebb79249c8bf21a51e85d7babfba1: Status 404 returned error can't find the container with id 2ad1355eb1f48289d159f84dc3556f6cb69ebb79249c8bf21a51e85d7babfba1
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.942752 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm6pv\" (UniqueName: \"kubernetes.io/projected/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-kube-api-access-cm6pv\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.943060 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-config-data\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.943179 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-job-config-data\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.943242 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-combined-ca-bundle\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.951863 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-combined-ca-bundle\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.952727 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-config-data\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.958980 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-job-config-data\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:23 crc kubenswrapper[4759]: I1205 01:42:23.962049 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm6pv\" (UniqueName: \"kubernetes.io/projected/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-kube-api-access-cm6pv\") pod \"manila-db-sync-ppb4v\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:24 crc kubenswrapper[4759]: I1205 01:42:24.046826 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-ppb4v"
Dec 05 01:42:24 crc kubenswrapper[4759]: I1205 01:42:24.151814 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c44ace8b-2e47-4682-bcea-3626f840d31b","Type":"ContainerStarted","Data":"2ad1355eb1f48289d159f84dc3556f6cb69ebb79249c8bf21a51e85d7babfba1"}
Dec 05 01:42:29 crc kubenswrapper[4759]: I1205 01:42:29.428569 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-ppb4v"]
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.218384 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-ppb4v" event={"ID":"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6","Type":"ContainerStarted","Data":"efb1eaf6b54d6e2e22c92fa165b6ae3703ee031cfd4cdf2fc584681578719cb1"}
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.220451 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f597fc49-47d87" event={"ID":"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b","Type":"ContainerStarted","Data":"6f520c4e40e02037b993080740d980a9e2f7226d3295785deb93f4251767b602"}
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.220492 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f597fc49-47d87" event={"ID":"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b","Type":"ContainerStarted","Data":"205bce0544bc92bfe30c54987cdc66e7c0fa7d8cd956564468042da00eded278"}
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.220597 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66f597fc49-47d87" podUID="4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" containerName="horizon-log" containerID="cri-o://205bce0544bc92bfe30c54987cdc66e7c0fa7d8cd956564468042da00eded278" gracePeriod=30
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.221112 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66f597fc49-47d87" podUID="4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" containerName="horizon" containerID="cri-o://6f520c4e40e02037b993080740d980a9e2f7226d3295785deb93f4251767b602" gracePeriod=30
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.224400 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dc499994d-vp27z" event={"ID":"81556b05-cd4e-407a-830f-e7e38962d519","Type":"ContainerStarted","Data":"e945db0bfa1fb35b3cbeaf090bb77cee60379e117277a1211815d872ffc99612"}
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.224438 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dc499994d-vp27z" event={"ID":"81556b05-cd4e-407a-830f-e7e38962d519","Type":"ContainerStarted","Data":"55a599fb1faf0393447100ffa2c8dd50972a38c5e3b7045b5d90a076c6d5b2e7"}
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.227099 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8b95d7c69-zrnrb" event={"ID":"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f","Type":"ContainerStarted","Data":"ed7d073e56a2beccb4ba930b74418c9e723c0649cc262296a634b84683583cb5"}
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.227143 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8b95d7c69-zrnrb" event={"ID":"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f","Type":"ContainerStarted","Data":"077098bfcaabc3263eb49605a4bc5d96435edaee752a375d4c414f28395e4b72"}
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.227171 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8b95d7c69-zrnrb" podUID="d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" containerName="horizon-log" containerID="cri-o://077098bfcaabc3263eb49605a4bc5d96435edaee752a375d4c414f28395e4b72" gracePeriod=30
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.227185 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8b95d7c69-zrnrb" podUID="d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" containerName="horizon" containerID="cri-o://ed7d073e56a2beccb4ba930b74418c9e723c0649cc262296a634b84683583cb5" gracePeriod=30
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.229186 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c44ace8b-2e47-4682-bcea-3626f840d31b","Type":"ContainerStarted","Data":"5ff047f55e4304f62e6aad0ae5c432df92f84e92b652b9cae89a50e174e3cee4"}
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.231401 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e29f7592-d4e5-4a46-bcdd-b52666d8e689","Type":"ContainerStarted","Data":"dc26c0634617b56c7b8a01422783ecf27c1cab833cf0474c689450be188bdf75"}
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.242532 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-759bc69744-82t58" event={"ID":"9978bcbc-9b12-401b-b73e-7aeb17587928","Type":"ContainerStarted","Data":"290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb"}
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.242579 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-759bc69744-82t58" event={"ID":"9978bcbc-9b12-401b-b73e-7aeb17587928","Type":"ContainerStarted","Data":"4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57"}
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.274294 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66f597fc49-47d87" podStartSLOduration=2.830689368 podStartE2EDuration="17.273784829s" podCreationTimestamp="2025-12-05 01:42:13 +0000 UTC" firstStartedPulling="2025-12-05 01:42:14.472683205 +0000 UTC m=+4753.688344155" lastFinishedPulling="2025-12-05 01:42:28.915778646 +0000 UTC m=+4768.131439616" observedRunningTime="2025-12-05 01:42:30.238153599 +0000 UTC m=+4769.453814549" watchObservedRunningTime="2025-12-05 01:42:30.273784829 +0000 UTC m=+4769.489445779"
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.297826 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.297806958 podStartE2EDuration="11.297806958s" podCreationTimestamp="2025-12-05 01:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:42:30.261610265 +0000 UTC m=+4769.477271225" watchObservedRunningTime="2025-12-05 01:42:30.297806958 +0000 UTC m=+4769.513467908"
Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.301586 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8b95d7c69-zrnrb"
podStartSLOduration=2.908049115 podStartE2EDuration="17.301570809s" podCreationTimestamp="2025-12-05 01:42:13 +0000 UTC" firstStartedPulling="2025-12-05 01:42:14.693000831 +0000 UTC m=+4753.908661781" lastFinishedPulling="2025-12-05 01:42:29.086522525 +0000 UTC m=+4768.302183475" observedRunningTime="2025-12-05 01:42:30.283037222 +0000 UTC m=+4769.498698172" watchObservedRunningTime="2025-12-05 01:42:30.301570809 +0000 UTC m=+4769.517231759" Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.320832 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6dc499994d-vp27z" podStartSLOduration=3.694507766 podStartE2EDuration="14.320816163s" podCreationTimestamp="2025-12-05 01:42:16 +0000 UTC" firstStartedPulling="2025-12-05 01:42:18.322434724 +0000 UTC m=+4757.538095674" lastFinishedPulling="2025-12-05 01:42:28.948743111 +0000 UTC m=+4768.164404071" observedRunningTime="2025-12-05 01:42:30.2983015 +0000 UTC m=+4769.513962450" watchObservedRunningTime="2025-12-05 01:42:30.320816163 +0000 UTC m=+4769.536477113" Dec 05 01:42:30 crc kubenswrapper[4759]: I1205 01:42:30.334507 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-759bc69744-82t58" podStartSLOduration=3.38299627 podStartE2EDuration="14.334490883s" podCreationTimestamp="2025-12-05 01:42:16 +0000 UTC" firstStartedPulling="2025-12-05 01:42:17.999110613 +0000 UTC m=+4757.214771563" lastFinishedPulling="2025-12-05 01:42:28.950605226 +0000 UTC m=+4768.166266176" observedRunningTime="2025-12-05 01:42:30.319805439 +0000 UTC m=+4769.535466399" watchObservedRunningTime="2025-12-05 01:42:30.334490883 +0000 UTC m=+4769.550151833" Dec 05 01:42:31 crc kubenswrapper[4759]: I1205 01:42:31.259486 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c44ace8b-2e47-4682-bcea-3626f840d31b","Type":"ContainerStarted","Data":"9d58cf893279aafda98c8eebcb863e4f31baefd185c18648eeaf4d2ea830b57b"} Dec 05 01:42:31 crc kubenswrapper[4759]: I1205 01:42:31.308454 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.30843384 podStartE2EDuration="11.30843384s" podCreationTimestamp="2025-12-05 01:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:42:31.301411651 +0000 UTC m=+4770.517072621" watchObservedRunningTime="2025-12-05 01:42:31.30843384 +0000 UTC m=+4770.524094790" Dec 05 01:42:33 crc kubenswrapper[4759]: I1205 01:42:33.629769 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:42:33 crc kubenswrapper[4759]: I1205 01:42:33.898682 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:42:34 crc kubenswrapper[4759]: I1205 01:42:34.433593 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:42:34 crc kubenswrapper[4759]: I1205 01:42:34.433667 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:42:34 crc kubenswrapper[4759]: I1205 01:42:34.433720 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 01:42:34 crc kubenswrapper[4759]: I1205 01:42:34.434351 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ce8dcfeed1a3217aa548a4e4b2248720f8aefade2ef5bbe94669f831255aef7"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 01:42:34 crc kubenswrapper[4759]: I1205 01:42:34.434451 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://6ce8dcfeed1a3217aa548a4e4b2248720f8aefade2ef5bbe94669f831255aef7" gracePeriod=600 Dec 05 01:42:35 crc kubenswrapper[4759]: I1205 01:42:35.309933 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="6ce8dcfeed1a3217aa548a4e4b2248720f8aefade2ef5bbe94669f831255aef7" exitCode=0 Dec 05 01:42:35 crc kubenswrapper[4759]: I1205 01:42:35.310020 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"6ce8dcfeed1a3217aa548a4e4b2248720f8aefade2ef5bbe94669f831255aef7"} Dec 05 01:42:35 crc kubenswrapper[4759]: I1205 01:42:35.310320 4759 scope.go:117] "RemoveContainer" containerID="2cb0fe1a2d6fedc07135d5d9ddd3da960fddfbd3410b455c9e28a0d5d95b23bc" Dec 05 01:42:36 crc kubenswrapper[4759]: I1205 01:42:36.322816 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-ppb4v" event={"ID":"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6","Type":"ContainerStarted","Data":"149e653f61d81e82c03299e1eabddf754805cd1e7e74616750366bb31a2d95ab"} Dec 05 01:42:36 crc kubenswrapper[4759]: I1205 01:42:36.326103 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"} Dec 05 01:42:36 crc kubenswrapper[4759]: I1205 01:42:36.347913 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-ppb4v" podStartSLOduration=7.713063587 podStartE2EDuration="13.347894431s" podCreationTimestamp="2025-12-05 01:42:23 +0000 UTC" firstStartedPulling="2025-12-05 01:42:29.458887459 +0000 UTC m=+4768.674548419" lastFinishedPulling="2025-12-05 01:42:35.093718313 +0000 UTC m=+4774.309379263" observedRunningTime="2025-12-05 01:42:36.344788716 +0000 UTC m=+4775.560449666" watchObservedRunningTime="2025-12-05 01:42:36.347894431 +0000 UTC m=+4775.563555381" Dec 05 01:42:37 crc kubenswrapper[4759]: I1205 01:42:37.141610 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:37 crc kubenswrapper[4759]: I1205 01:42:37.143491 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-759bc69744-82t58" Dec 05 
01:42:37 crc kubenswrapper[4759]: I1205 01:42:37.261713 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:37 crc kubenswrapper[4759]: I1205 01:42:37.262319 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:42:39 crc kubenswrapper[4759]: I1205 01:42:39.729862 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 01:42:39 crc kubenswrapper[4759]: I1205 01:42:39.730631 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 01:42:39 crc kubenswrapper[4759]: I1205 01:42:39.777593 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 01:42:39 crc kubenswrapper[4759]: I1205 01:42:39.796547 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 01:42:40 crc kubenswrapper[4759]: I1205 01:42:40.385645 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 01:42:40 crc kubenswrapper[4759]: I1205 01:42:40.385972 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 01:42:41 crc kubenswrapper[4759]: I1205 01:42:41.098760 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:41 crc kubenswrapper[4759]: I1205 01:42:41.098811 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:41 crc kubenswrapper[4759]: I1205 01:42:41.129476 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:41 crc kubenswrapper[4759]: I1205 01:42:41.173687 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:41 crc kubenswrapper[4759]: I1205 01:42:41.395842 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:41 crc kubenswrapper[4759]: I1205 01:42:41.395891 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:42 crc kubenswrapper[4759]: I1205 01:42:42.811089 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 01:42:42 crc kubenswrapper[4759]: I1205 01:42:42.811488 4759 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 01:42:42 crc kubenswrapper[4759]: I1205 01:42:42.925259 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 01:42:43 crc kubenswrapper[4759]: I1205 01:42:43.499131 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 01:42:43 crc kubenswrapper[4759]: I1205 01:42:43.499545 4759 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 01:42:43 crc kubenswrapper[4759]: I1205 01:42:43.665636 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 
01:42:46 crc kubenswrapper[4759]: I1205 01:42:46.463790 4759 generic.go:334] "Generic (PLEG): container finished" podID="6457d70c-b3df-4b29-9d07-9f4ebc24b1b6" containerID="149e653f61d81e82c03299e1eabddf754805cd1e7e74616750366bb31a2d95ab" exitCode=0 Dec 05 01:42:46 crc kubenswrapper[4759]: I1205 01:42:46.464278 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-ppb4v" event={"ID":"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6","Type":"ContainerDied","Data":"149e653f61d81e82c03299e1eabddf754805cd1e7e74616750366bb31a2d95ab"} Dec 05 01:42:47 crc kubenswrapper[4759]: I1205 01:42:47.145941 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-759bc69744-82t58" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.65:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.65:8443: connect: connection refused" Dec 05 01:42:47 crc kubenswrapper[4759]: I1205 01:42:47.264738 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6dc499994d-vp27z" podUID="81556b05-cd4e-407a-830f-e7e38962d519" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.66:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.66:8443: connect: connection refused" Dec 05 01:42:47 crc kubenswrapper[4759]: I1205 01:42:47.962783 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-ppb4v" Dec 05 01:42:47 crc kubenswrapper[4759]: I1205 01:42:47.986917 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-config-data\") pod \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " Dec 05 01:42:47 crc kubenswrapper[4759]: I1205 01:42:47.987063 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-combined-ca-bundle\") pod \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " Dec 05 01:42:47 crc kubenswrapper[4759]: I1205 01:42:47.987103 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-job-config-data\") pod \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " Dec 05 01:42:47 crc kubenswrapper[4759]: I1205 01:42:47.987182 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm6pv\" (UniqueName: \"kubernetes.io/projected/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-kube-api-access-cm6pv\") pod \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\" (UID: \"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6\") " Dec 05 01:42:47 crc kubenswrapper[4759]: I1205 01:42:47.995681 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-kube-api-access-cm6pv" (OuterVolumeSpecName: "kube-api-access-cm6pv") pod "6457d70c-b3df-4b29-9d07-9f4ebc24b1b6" (UID: "6457d70c-b3df-4b29-9d07-9f4ebc24b1b6"). InnerVolumeSpecName "kube-api-access-cm6pv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:47.999170 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-config-data" (OuterVolumeSpecName: "config-data") pod "6457d70c-b3df-4b29-9d07-9f4ebc24b1b6" (UID: "6457d70c-b3df-4b29-9d07-9f4ebc24b1b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.013473 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "6457d70c-b3df-4b29-9d07-9f4ebc24b1b6" (UID: "6457d70c-b3df-4b29-9d07-9f4ebc24b1b6"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.034300 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6457d70c-b3df-4b29-9d07-9f4ebc24b1b6" (UID: "6457d70c-b3df-4b29-9d07-9f4ebc24b1b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.089154 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.089187 4759 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-job-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.089200 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm6pv\" (UniqueName: \"kubernetes.io/projected/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-kube-api-access-cm6pv\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.089215 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.526385 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-ppb4v" event={"ID":"6457d70c-b3df-4b29-9d07-9f4ebc24b1b6","Type":"ContainerDied","Data":"efb1eaf6b54d6e2e22c92fa165b6ae3703ee031cfd4cdf2fc584681578719cb1"} Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.526436 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efb1eaf6b54d6e2e22c92fa165b6ae3703ee031cfd4cdf2fc584681578719cb1" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.526505 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-ppb4v" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.723394 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 05 01:42:48 crc kubenswrapper[4759]: E1205 01:42:48.724219 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6457d70c-b3df-4b29-9d07-9f4ebc24b1b6" containerName="manila-db-sync" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.724235 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="6457d70c-b3df-4b29-9d07-9f4ebc24b1b6" containerName="manila-db-sync" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.724501 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="6457d70c-b3df-4b29-9d07-9f4ebc24b1b6" containerName="manila-db-sync" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.725651 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.736969 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.737029 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-44nxb" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.737478 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.737548 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.741546 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.743408 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.748974 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.786500 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.816293 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917197 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917257 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55q64\" (UniqueName: \"kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-kube-api-access-55q64\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917284 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-ceph\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917325 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917366 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917408 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-scripts\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917438 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917468 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data\") pod 
\"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917493 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vvw8\" (UniqueName: \"kubernetes.io/projected/c618cb90-2d43-4445-9955-e7de3bf03e41-kube-api-access-8vvw8\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917517 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917532 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917547 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c618cb90-2d43-4445-9955-e7de3bf03e41-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917586 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.917636 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-scripts\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.963369 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c6cf8d999-rmpvx"] Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.965293 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:48 crc kubenswrapper[4759]: I1205 01:42:48.978240 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c6cf8d999-rmpvx"] Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020053 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020104 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-dns-svc\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020146 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-scripts\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020162 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-ovsdbserver-nb\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020190 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tk8s\" (UniqueName: \"kubernetes.io/projected/afdea169-3f66-4ad6-be4e-755db23f6a50-kube-api-access-9tk8s\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020213 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020246 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55q64\" (UniqueName: \"kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-kube-api-access-55q64\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020265 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020280 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-ceph\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020329 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020366 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-dns-swift-storage-0\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020381 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-ovsdbserver-sb\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020418 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-scripts\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020435 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020455 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020483 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020509 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vvw8\" (UniqueName: \"kubernetes.io/projected/c618cb90-2d43-4445-9955-e7de3bf03e41-kube-api-access-8vvw8\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020530 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data-custom\") pod 
\"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020545 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020558 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c618cb90-2d43-4445-9955-e7de3bf03e41-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.020576 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-config\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.035177 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.035371 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.035417 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c618cb90-2d43-4445-9955-e7de3bf03e41-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.037227 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.045417 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.047297 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.047951 4759 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.049510 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.063977 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-scripts\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.064433 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.064739 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-scripts\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.080749 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-ceph\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.082865 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55q64\" (UniqueName: \"kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-kube-api-access-55q64\") pod \"manila-share-share1-0\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.083399 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.139122 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.155167 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vvw8\" (UniqueName: \"kubernetes.io/projected/c618cb90-2d43-4445-9955-e7de3bf03e41-kube-api-access-8vvw8\") pod \"manila-scheduler-0\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.182507 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-dns-swift-storage-0\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.184184 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-ovsdbserver-sb\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.183819 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-dns-swift-storage-0\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.184467 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.184697 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-config\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.187378 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-dns-svc\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.187504 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-ovsdbserver-nb\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.187555 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tk8s\" (UniqueName: 
\"kubernetes.io/projected/afdea169-3f66-4ad6-be4e-755db23f6a50-kube-api-access-9tk8s\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.188628 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-dns-svc\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.185590 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-config\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.186237 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-ovsdbserver-sb\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.191207 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-ovsdbserver-nb\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.192190 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/afdea169-3f66-4ad6-be4e-755db23f6a50-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.207003 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.212007 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.228384 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tk8s\" (UniqueName: \"kubernetes.io/projected/afdea169-3f66-4ad6-be4e-755db23f6a50-kube-api-access-9tk8s\") pod \"dnsmasq-dns-6c6cf8d999-rmpvx\" (UID: \"afdea169-3f66-4ad6-be4e-755db23f6a50\") " pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.315732 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.322211 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.322328 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-scripts\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.322385 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data-custom\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.322431 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5pss\" (UniqueName: \"kubernetes.io/projected/204b6e54-6e90-4898-b6a8-257780287758-kube-api-access-m5pss\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.322486 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204b6e54-6e90-4898-b6a8-257780287758-logs\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.322523 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/204b6e54-6e90-4898-b6a8-257780287758-etc-machine-id\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.322552 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.367885 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.371931 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.438009 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-scripts\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.438194 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data-custom\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.438293 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5pss\" (UniqueName: \"kubernetes.io/projected/204b6e54-6e90-4898-b6a8-257780287758-kube-api-access-m5pss\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.438369 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204b6e54-6e90-4898-b6a8-257780287758-logs\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.438440 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/204b6e54-6e90-4898-b6a8-257780287758-etc-machine-id\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.438499 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.438588 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.441450 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/204b6e54-6e90-4898-b6a8-257780287758-etc-machine-id\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.442261 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204b6e54-6e90-4898-b6a8-257780287758-logs\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.453227 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" 
Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.456661 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-scripts\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.457365 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.457845 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5pss\" (UniqueName: \"kubernetes.io/projected/204b6e54-6e90-4898-b6a8-257780287758-kube-api-access-m5pss\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.465827 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data-custom\") pod \"manila-api-0\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.694087 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 05 01:42:49 crc kubenswrapper[4759]: I1205 01:42:49.922123 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 05 01:42:50 crc kubenswrapper[4759]: I1205 01:42:50.055115 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c6cf8d999-rmpvx"] Dec 05 01:42:50 crc kubenswrapper[4759]: W1205 01:42:50.068284 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafdea169_3f66_4ad6_be4e_755db23f6a50.slice/crio-0099c6d42fc62e22476dd0d690632a5b49a605d3af0815f8f91b44e16b8d8caa WatchSource:0}: Error finding container 0099c6d42fc62e22476dd0d690632a5b49a605d3af0815f8f91b44e16b8d8caa: Status 404 returned error can't find the container with id 0099c6d42fc62e22476dd0d690632a5b49a605d3af0815f8f91b44e16b8d8caa Dec 05 01:42:50 crc kubenswrapper[4759]: I1205 01:42:50.262590 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 05 01:42:50 crc kubenswrapper[4759]: I1205 01:42:50.497345 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 05 01:42:50 crc kubenswrapper[4759]: W1205 01:42:50.501446 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod204b6e54_6e90_4898_b6a8_257780287758.slice/crio-bc8f28f1b9295b51eb4bd7fdc828f86222bd071f49ea1818f9c1af6a309022fc WatchSource:0}: Error finding container bc8f28f1b9295b51eb4bd7fdc828f86222bd071f49ea1818f9c1af6a309022fc: Status 404 returned error can't find the container with id bc8f28f1b9295b51eb4bd7fdc828f86222bd071f49ea1818f9c1af6a309022fc Dec 05 01:42:50 crc kubenswrapper[4759]: I1205 01:42:50.629477 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"204b6e54-6e90-4898-b6a8-257780287758","Type":"ContainerStarted","Data":"bc8f28f1b9295b51eb4bd7fdc828f86222bd071f49ea1818f9c1af6a309022fc"} Dec 05 01:42:50 crc kubenswrapper[4759]: I1205 01:42:50.631381 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"c618cb90-2d43-4445-9955-e7de3bf03e41","Type":"ContainerStarted","Data":"dcb604eb0593c58dc324a30b3ded7bd930ace991217a61bdca6aafc75a578098"} Dec 05 01:42:50 crc kubenswrapper[4759]: I1205 01:42:50.633205 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"eac762df-54b7-43a3-bd65-a6fadb802446","Type":"ContainerStarted","Data":"3b6d0da6b6587c9c03409a5ecdd26c166cb716e9317c8931754812dd03b568d1"} Dec 05 01:42:50 crc kubenswrapper[4759]: I1205 01:42:50.647834 4759 generic.go:334] "Generic (PLEG): container finished" podID="afdea169-3f66-4ad6-be4e-755db23f6a50" containerID="a0b230e711cec4ef8cc1f2a49dfbc0f87506d676a7700264a08bc432ba06c284" exitCode=0 Dec 05 01:42:50 crc kubenswrapper[4759]: I1205 01:42:50.647880 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" event={"ID":"afdea169-3f66-4ad6-be4e-755db23f6a50","Type":"ContainerDied","Data":"a0b230e711cec4ef8cc1f2a49dfbc0f87506d676a7700264a08bc432ba06c284"} Dec 05 01:42:50 crc kubenswrapper[4759]: I1205 01:42:50.647906 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" event={"ID":"afdea169-3f66-4ad6-be4e-755db23f6a50","Type":"ContainerStarted","Data":"0099c6d42fc62e22476dd0d690632a5b49a605d3af0815f8f91b44e16b8d8caa"} Dec 05 01:42:51 crc kubenswrapper[4759]: I1205 01:42:51.678523 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" event={"ID":"afdea169-3f66-4ad6-be4e-755db23f6a50","Type":"ContainerStarted","Data":"ab5aeff118caacd390ca540874adb902c6c8b355819c9a062b107dc2b2894403"} Dec 05 01:42:51 crc kubenswrapper[4759]: I1205 01:42:51.679505 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" Dec 05 01:42:51 crc kubenswrapper[4759]: I1205 01:42:51.697492 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"204b6e54-6e90-4898-b6a8-257780287758","Type":"ContainerStarted","Data":"82e06aa613fa901364096bf245fbd0c813d94ee5f5efe98a482f42dd0ac1ea6d"} Dec 05 01:42:51 crc kubenswrapper[4759]: I1205 01:42:51.709134 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx" podStartSLOduration=3.709113902 podStartE2EDuration="3.709113902s" podCreationTimestamp="2025-12-05 01:42:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:42:51.703892525 +0000 UTC m=+4790.919553475" watchObservedRunningTime="2025-12-05 01:42:51.709113902 +0000 UTC m=+4790.924774852" Dec 05 01:42:52 crc kubenswrapper[4759]: I1205 01:42:52.513872 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Dec 05 01:42:52 crc kubenswrapper[4759]: I1205 01:42:52.741584 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"c618cb90-2d43-4445-9955-e7de3bf03e41","Type":"ContainerStarted","Data":"27556903511d415ef02ef48e40e143d700e345be456472317f6eb7c6b608210e"} Dec 05 01:42:52 crc kubenswrapper[4759]: I1205 01:42:52.741621 4759 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"c618cb90-2d43-4445-9955-e7de3bf03e41","Type":"ContainerStarted","Data":"9fbd8745049c803f7cc194f3ddf198bfd5b2f8ea4d0a2bf3ee1a32bedee28bcf"} Dec 05 01:42:52 crc kubenswrapper[4759]: I1205 01:42:52.748872 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"204b6e54-6e90-4898-b6a8-257780287758","Type":"ContainerStarted","Data":"86dfb765cc73a746ee3ae88fb18f235734328924d30ed5a4ab1bb96af820aac3"} Dec 05 01:42:52 crc kubenswrapper[4759]: I1205 01:42:52.749257 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 05 01:42:52 crc kubenswrapper[4759]: I1205 01:42:52.779922 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.027139233 podStartE2EDuration="4.779906315s" podCreationTimestamp="2025-12-05 01:42:48 +0000 UTC" firstStartedPulling="2025-12-05 01:42:50.283784514 +0000 UTC m=+4789.499445464" lastFinishedPulling="2025-12-05 01:42:51.036551596 +0000 UTC m=+4790.252212546" observedRunningTime="2025-12-05 01:42:52.771491642 +0000 UTC m=+4791.987152592" watchObservedRunningTime="2025-12-05 01:42:52.779906315 +0000 UTC m=+4791.995567265" Dec 05 01:42:52 crc kubenswrapper[4759]: I1205 01:42:52.811595 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.81157461 podStartE2EDuration="3.81157461s" podCreationTimestamp="2025-12-05 01:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:42:52.798661217 +0000 UTC m=+4792.014322177" watchObservedRunningTime="2025-12-05 01:42:52.81157461 +0000 UTC m=+4792.027235560" Dec 05 01:42:53 crc kubenswrapper[4759]: I1205 01:42:53.757965 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="204b6e54-6e90-4898-b6a8-257780287758" containerName="manila-api-log" containerID="cri-o://82e06aa613fa901364096bf245fbd0c813d94ee5f5efe98a482f42dd0ac1ea6d" gracePeriod=30 Dec 05 01:42:53 crc kubenswrapper[4759]: I1205 01:42:53.758432 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="204b6e54-6e90-4898-b6a8-257780287758" containerName="manila-api" containerID="cri-o://86dfb765cc73a746ee3ae88fb18f235734328924d30ed5a4ab1bb96af820aac3" gracePeriod=30 Dec 05 01:42:54 crc kubenswrapper[4759]: I1205 01:42:54.774377 4759 generic.go:334] "Generic (PLEG): container finished" podID="204b6e54-6e90-4898-b6a8-257780287758" containerID="86dfb765cc73a746ee3ae88fb18f235734328924d30ed5a4ab1bb96af820aac3" exitCode=0 Dec 05 01:42:54 crc kubenswrapper[4759]: I1205 01:42:54.774983 4759 generic.go:334] "Generic (PLEG): container finished" podID="204b6e54-6e90-4898-b6a8-257780287758" containerID="82e06aa613fa901364096bf245fbd0c813d94ee5f5efe98a482f42dd0ac1ea6d" exitCode=143 Dec 05 01:42:54 crc kubenswrapper[4759]: I1205 01:42:54.775013 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"204b6e54-6e90-4898-b6a8-257780287758","Type":"ContainerDied","Data":"86dfb765cc73a746ee3ae88fb18f235734328924d30ed5a4ab1bb96af820aac3"} Dec 05 01:42:54 crc kubenswrapper[4759]: I1205 01:42:54.775044 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"204b6e54-6e90-4898-b6a8-257780287758","Type":"ContainerDied","Data":"82e06aa613fa901364096bf245fbd0c813d94ee5f5efe98a482f42dd0ac1ea6d"} Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.100752 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.160188 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5pss\" (UniqueName: \"kubernetes.io/projected/204b6e54-6e90-4898-b6a8-257780287758-kube-api-access-m5pss\") pod \"204b6e54-6e90-4898-b6a8-257780287758\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.160397 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/204b6e54-6e90-4898-b6a8-257780287758-etc-machine-id\") pod \"204b6e54-6e90-4898-b6a8-257780287758\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.160514 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-scripts\") pod \"204b6e54-6e90-4898-b6a8-257780287758\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.160571 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data-custom\") pod \"204b6e54-6e90-4898-b6a8-257780287758\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.160617 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-combined-ca-bundle\") pod \"204b6e54-6e90-4898-b6a8-257780287758\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.160663 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data\") pod \"204b6e54-6e90-4898-b6a8-257780287758\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.160656 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/204b6e54-6e90-4898-b6a8-257780287758-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "204b6e54-6e90-4898-b6a8-257780287758" (UID: "204b6e54-6e90-4898-b6a8-257780287758"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.160757 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204b6e54-6e90-4898-b6a8-257780287758-logs\") pod \"204b6e54-6e90-4898-b6a8-257780287758\" (UID: \"204b6e54-6e90-4898-b6a8-257780287758\") " Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.161223 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/204b6e54-6e90-4898-b6a8-257780287758-logs" (OuterVolumeSpecName: "logs") pod "204b6e54-6e90-4898-b6a8-257780287758" (UID: "204b6e54-6e90-4898-b6a8-257780287758"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.161394 4759 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/204b6e54-6e90-4898-b6a8-257780287758-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.173288 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/204b6e54-6e90-4898-b6a8-257780287758-kube-api-access-m5pss" (OuterVolumeSpecName: "kube-api-access-m5pss") pod "204b6e54-6e90-4898-b6a8-257780287758" (UID: "204b6e54-6e90-4898-b6a8-257780287758"). InnerVolumeSpecName "kube-api-access-m5pss". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.174332 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-scripts" (OuterVolumeSpecName: "scripts") pod "204b6e54-6e90-4898-b6a8-257780287758" (UID: "204b6e54-6e90-4898-b6a8-257780287758"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.191670 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "204b6e54-6e90-4898-b6a8-257780287758" (UID: "204b6e54-6e90-4898-b6a8-257780287758"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.204819 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "204b6e54-6e90-4898-b6a8-257780287758" (UID: "204b6e54-6e90-4898-b6a8-257780287758"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.251357 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data" (OuterVolumeSpecName: "config-data") pod "204b6e54-6e90-4898-b6a8-257780287758" (UID: "204b6e54-6e90-4898-b6a8-257780287758"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.263753 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.263803 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.263815 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.263822 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204b6e54-6e90-4898-b6a8-257780287758-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.263832 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204b6e54-6e90-4898-b6a8-257780287758-logs\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.263842 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5pss\" (UniqueName: \"kubernetes.io/projected/204b6e54-6e90-4898-b6a8-257780287758-kube-api-access-m5pss\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.790697 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"204b6e54-6e90-4898-b6a8-257780287758","Type":"ContainerDied","Data":"bc8f28f1b9295b51eb4bd7fdc828f86222bd071f49ea1818f9c1af6a309022fc"} Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.790756 4759 scope.go:117] "RemoveContainer" containerID="86dfb765cc73a746ee3ae88fb18f235734328924d30ed5a4ab1bb96af820aac3" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.790830 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.835533 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.855436 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.870136 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 05 01:42:55 crc kubenswrapper[4759]: E1205 01:42:55.870689 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204b6e54-6e90-4898-b6a8-257780287758" containerName="manila-api-log" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.870709 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="204b6e54-6e90-4898-b6a8-257780287758" containerName="manila-api-log" Dec 05 01:42:55 crc kubenswrapper[4759]: E1205 01:42:55.870737 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204b6e54-6e90-4898-b6a8-257780287758" containerName="manila-api" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.870744 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="204b6e54-6e90-4898-b6a8-257780287758" containerName="manila-api" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.870981 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="204b6e54-6e90-4898-b6a8-257780287758" containerName="manila-api-log" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.871006 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="204b6e54-6e90-4898-b6a8-257780287758" containerName="manila-api" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.872260 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.882206 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.882409 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.882523 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.886368 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.985675 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-public-tls-certs\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.986099 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-scripts\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.986144 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.986175 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8dtw\" (UniqueName: \"kubernetes.io/projected/1acbfd13-8d88-4169-b1f6-098a33b9cc15-kube-api-access-m8dtw\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.986202 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-config-data-custom\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.986230 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-internal-tls-certs\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.986291 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1acbfd13-8d88-4169-b1f6-098a33b9cc15-etc-machine-id\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.986330 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-config-data\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:55 crc kubenswrapper[4759]: I1205 01:42:55.986389 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1acbfd13-8d88-4169-b1f6-098a33b9cc15-logs\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.087926 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-public-tls-certs\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.088039 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-scripts\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.088082 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.088107 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8dtw\" (UniqueName: \"kubernetes.io/projected/1acbfd13-8d88-4169-b1f6-098a33b9cc15-kube-api-access-m8dtw\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.088132 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-config-data-custom\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.088159 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-internal-tls-certs\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.088192 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1acbfd13-8d88-4169-b1f6-098a33b9cc15-etc-machine-id\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.088213 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-config-data\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.088231 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/1acbfd13-8d88-4169-b1f6-098a33b9cc15-logs\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.088676 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1acbfd13-8d88-4169-b1f6-098a33b9cc15-logs\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.088731 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1acbfd13-8d88-4169-b1f6-098a33b9cc15-etc-machine-id\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.093455 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-scripts\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.093694 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-internal-tls-certs\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.093846 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-public-tls-certs\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.094428 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.094656 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-config-data-custom\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.097390 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1acbfd13-8d88-4169-b1f6-098a33b9cc15-config-data\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.110102 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8dtw\" (UniqueName: \"kubernetes.io/projected/1acbfd13-8d88-4169-b1f6-098a33b9cc15-kube-api-access-m8dtw\") pod \"manila-api-0\" (UID: \"1acbfd13-8d88-4169-b1f6-098a33b9cc15\") " pod="openstack/manila-api-0" Dec 05 01:42:56 crc kubenswrapper[4759]: I1205 01:42:56.199034 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 05 01:42:57 crc kubenswrapper[4759]: I1205 01:42:57.171891 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="204b6e54-6e90-4898-b6a8-257780287758" path="/var/lib/kubelet/pods/204b6e54-6e90-4898-b6a8-257780287758/volumes" Dec 05 01:42:57 crc kubenswrapper[4759]: I1205 01:42:57.330114 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:42:57 crc kubenswrapper[4759]: I1205 01:42:57.330505 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="proxy-httpd" containerID="cri-o://7584ca3c88abe9328cc0aa6892b78b37a707fa617cfdbb0dd6c7dfa72510591e" gracePeriod=30 Dec 05 01:42:57 crc kubenswrapper[4759]: I1205 01:42:57.330561 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="ceilometer-central-agent" containerID="cri-o://05370620c000aa88c690b6ef35efbe18b8c2aa65f0860925c75e73bd24eb9936" gracePeriod=30 Dec 05 01:42:57 crc kubenswrapper[4759]: I1205 01:42:57.330583 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="ceilometer-notification-agent" containerID="cri-o://bc664683605fb9335216ce108d0e079d867b8ade4ade94f0f1b61df8190d35c9" gracePeriod=30 Dec 05 01:42:57 crc kubenswrapper[4759]: I1205 01:42:57.330585 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="sg-core" containerID="cri-o://1cb39de1c193df0cbe47099a1570e12bda587e09381cf57f34821395efedb940" gracePeriod=30 Dec 05 01:42:57 crc kubenswrapper[4759]: E1205 01:42:57.642870 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16466e0b_7a83_46aa_b39e_d52ea5c19f86.slice/crio-conmon-1cb39de1c193df0cbe47099a1570e12bda587e09381cf57f34821395efedb940.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16466e0b_7a83_46aa_b39e_d52ea5c19f86.slice/crio-conmon-7584ca3c88abe9328cc0aa6892b78b37a707fa617cfdbb0dd6c7dfa72510591e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16466e0b_7a83_46aa_b39e_d52ea5c19f86.slice/crio-7584ca3c88abe9328cc0aa6892b78b37a707fa617cfdbb0dd6c7dfa72510591e.scope\": RecentStats: unable to find data in memory cache]" Dec 05 01:42:57 crc kubenswrapper[4759]: I1205 01:42:57.822640 4759 generic.go:334] "Generic (PLEG): container finished" podID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerID="7584ca3c88abe9328cc0aa6892b78b37a707fa617cfdbb0dd6c7dfa72510591e" exitCode=0 Dec 05 01:42:57 crc kubenswrapper[4759]: I1205 01:42:57.822985 4759 generic.go:334] "Generic (PLEG): container finished" podID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerID="1cb39de1c193df0cbe47099a1570e12bda587e09381cf57f34821395efedb940" exitCode=2 Dec 05 01:42:57 crc kubenswrapper[4759]: I1205 01:42:57.823000 4759 generic.go:334] "Generic (PLEG): container finished" podID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerID="05370620c000aa88c690b6ef35efbe18b8c2aa65f0860925c75e73bd24eb9936" exitCode=0 Dec 05 01:42:57 crc 
Dec 05 01:42:57 crc kubenswrapper[4759]: I1205 01:42:57.823937 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16466e0b-7a83-46aa-b39e-d52ea5c19f86","Type":"ContainerDied","Data":"1cb39de1c193df0cbe47099a1570e12bda587e09381cf57f34821395efedb940"}
Dec 05 01:42:57 crc kubenswrapper[4759]: I1205 01:42:57.823951 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16466e0b-7a83-46aa-b39e-d52ea5c19f86","Type":"ContainerDied","Data":"05370620c000aa88c690b6ef35efbe18b8c2aa65f0860925c75e73bd24eb9936"}
Dec 05 01:42:58 crc kubenswrapper[4759]: I1205 01:42:58.890874 4759 generic.go:334] "Generic (PLEG): container finished" podID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerID="bc664683605fb9335216ce108d0e079d867b8ade4ade94f0f1b61df8190d35c9" exitCode=0
Dec 05 01:42:58 crc kubenswrapper[4759]: I1205 01:42:58.890916 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16466e0b-7a83-46aa-b39e-d52ea5c19f86","Type":"ContainerDied","Data":"bc664683605fb9335216ce108d0e079d867b8ade4ade94f0f1b61df8190d35c9"}
Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.127429 4759 scope.go:117] "RemoveContainer" containerID="82e06aa613fa901364096bf245fbd0c813d94ee5f5efe98a482f42dd0ac1ea6d"
Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.319603 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c6cf8d999-rmpvx"
Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.384826 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.401090 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dd9576ff-cgflc"]
Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.401363 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" podUID="9542f289-2a5b-4593-8cf5-d43690c6440e" containerName="dnsmasq-dns" containerID="cri-o://26067cd9ec606d5cee224ef399c9cf32d94ed7ef69e2eee1f341554299e7944f" gracePeriod=10
Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.560456 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.612853 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-combined-ca-bundle\") pod \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.613999 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-ceilometer-tls-certs\") pod \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.614184 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g6mb\" (UniqueName: \"kubernetes.io/projected/16466e0b-7a83-46aa-b39e-d52ea5c19f86-kube-api-access-9g6mb\") pod \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.614542 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-sg-core-conf-yaml\") pod \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.614603 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-run-httpd\") pod \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.614697 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-config-data\") pod \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.614718 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-scripts\") pod \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.614769 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-log-httpd\") pod \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\" (UID: \"16466e0b-7a83-46aa-b39e-d52ea5c19f86\") " Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.643792 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "16466e0b-7a83-46aa-b39e-d52ea5c19f86" (UID: "16466e0b-7a83-46aa-b39e-d52ea5c19f86"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.644945 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "16466e0b-7a83-46aa-b39e-d52ea5c19f86" (UID: "16466e0b-7a83-46aa-b39e-d52ea5c19f86"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.648287 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-scripts" (OuterVolumeSpecName: "scripts") pod "16466e0b-7a83-46aa-b39e-d52ea5c19f86" (UID: "16466e0b-7a83-46aa-b39e-d52ea5c19f86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.658566 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16466e0b-7a83-46aa-b39e-d52ea5c19f86-kube-api-access-9g6mb" (OuterVolumeSpecName: "kube-api-access-9g6mb") pod "16466e0b-7a83-46aa-b39e-d52ea5c19f86" (UID: "16466e0b-7a83-46aa-b39e-d52ea5c19f86"). InnerVolumeSpecName "kube-api-access-9g6mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.678275 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-759bc69744-82t58" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.723264 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g6mb\" (UniqueName: \"kubernetes.io/projected/16466e0b-7a83-46aa-b39e-d52ea5c19f86-kube-api-access-9g6mb\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.723313 4759 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.723326 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.723336 4759 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16466e0b-7a83-46aa-b39e-d52ea5c19f86-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.733298 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "16466e0b-7a83-46aa-b39e-d52ea5c19f86" (UID: "16466e0b-7a83-46aa-b39e-d52ea5c19f86"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.772030 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "16466e0b-7a83-46aa-b39e-d52ea5c19f86" (UID: "16466e0b-7a83-46aa-b39e-d52ea5c19f86"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.826449 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.826478 4759 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.848982 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-config-data" (OuterVolumeSpecName: "config-data") pod "16466e0b-7a83-46aa-b39e-d52ea5c19f86" (UID: "16466e0b-7a83-46aa-b39e-d52ea5c19f86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:59 crc kubenswrapper[4759]: W1205 01:42:59.852102 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1acbfd13_8d88_4169_b1f6_098a33b9cc15.slice/crio-c973110ddf1cf0164b1d3d1e220e4094ffa473e79365b7e9a23da9d7c1e74db6 WatchSource:0}: Error finding container c973110ddf1cf0164b1d3d1e220e4094ffa473e79365b7e9a23da9d7c1e74db6: Status 404 returned error can't find the container with id c973110ddf1cf0164b1d3d1e220e4094ffa473e79365b7e9a23da9d7c1e74db6 Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.853980 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16466e0b-7a83-46aa-b39e-d52ea5c19f86" (UID: "16466e0b-7a83-46aa-b39e-d52ea5c19f86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.862633 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.902750 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1acbfd13-8d88-4169-b1f6-098a33b9cc15","Type":"ContainerStarted","Data":"c973110ddf1cf0164b1d3d1e220e4094ffa473e79365b7e9a23da9d7c1e74db6"} Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.906823 4759 generic.go:334] "Generic (PLEG): container finished" podID="9542f289-2a5b-4593-8cf5-d43690c6440e" containerID="26067cd9ec606d5cee224ef399c9cf32d94ed7ef69e2eee1f341554299e7944f" exitCode=0 Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.906912 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" event={"ID":"9542f289-2a5b-4593-8cf5-d43690c6440e","Type":"ContainerDied","Data":"26067cd9ec606d5cee224ef399c9cf32d94ed7ef69e2eee1f341554299e7944f"} Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.909552 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16466e0b-7a83-46aa-b39e-d52ea5c19f86","Type":"ContainerDied","Data":"ce9e40465373db2bd43b35c82ca42e76ce4a834a9827342f1daa370a62a6c401"} Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.909590 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.909625 4759 scope.go:117] "RemoveContainer" containerID="7584ca3c88abe9328cc0aa6892b78b37a707fa617cfdbb0dd6c7dfa72510591e" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.928765 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.928796 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16466e0b-7a83-46aa-b39e-d52ea5c19f86-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.969771 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.973227 4759 scope.go:117] "RemoveContainer" containerID="1cb39de1c193df0cbe47099a1570e12bda587e09381cf57f34821395efedb940" Dec 05 01:42:59 crc kubenswrapper[4759]: I1205 01:42:59.994202 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.028021 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:43:00 crc kubenswrapper[4759]: E1205 01:43:00.028508 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="ceilometer-notification-agent" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.028523 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="ceilometer-notification-agent" Dec 05 01:43:00 crc kubenswrapper[4759]: E1205 01:43:00.028540 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="sg-core" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.028546 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="sg-core" Dec 05 01:43:00 crc kubenswrapper[4759]: E1205 01:43:00.028580 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="ceilometer-central-agent" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.028588 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="ceilometer-central-agent" Dec 05 01:43:00 crc kubenswrapper[4759]: E1205 01:43:00.028607 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="proxy-httpd" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.028613 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="proxy-httpd" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.028839 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="ceilometer-notification-agent" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.028866 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="ceilometer-central-agent" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.028879 4759 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="proxy-httpd" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.028891 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" containerName="sg-core" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.029517 4759 scope.go:117] "RemoveContainer" containerID="bc664683605fb9335216ce108d0e079d867b8ade4ade94f0f1b61df8190d35c9" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.030973 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.037324 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.037534 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.037631 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.041648 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.075738 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.133939 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-config-data\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.134081 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.134102 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-scripts\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.134192 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-log-httpd\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.134219 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.134242 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.134272 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-run-httpd\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.134295 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcjhj\" (UniqueName: \"kubernetes.io/projected/5ea4288a-d1d7-4434-9f4e-940442905231-kube-api-access-jcjhj\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.193066 4759 scope.go:117] "RemoveContainer" containerID="05370620c000aa88c690b6ef35efbe18b8c2aa65f0860925c75e73bd24eb9936" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.237219 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.237288 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-scripts\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.237544 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-log-httpd\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.237623 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.237670 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.237703 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-run-httpd\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.237736 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcjhj\" (UniqueName: \"kubernetes.io/projected/5ea4288a-d1d7-4434-9f4e-940442905231-kube-api-access-jcjhj\") pod \"ceilometer-0\" (UID: 
\"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.237819 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-config-data\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.241357 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-run-httpd\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.242755 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-log-httpd\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.252070 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.261236 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-scripts\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.261681 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.267012 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.270855 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-config-data\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.273721 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcjhj\" (UniqueName: \"kubernetes.io/projected/5ea4288a-d1d7-4434-9f4e-940442905231-kube-api-access-jcjhj\") pod \"ceilometer-0\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " pod="openstack/ceilometer-0" Dec 05 01:43:00 crc kubenswrapper[4759]: I1205 01:43:00.458684 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 01:43:01 crc kubenswrapper[4759]: I1205 01:43:01.175250 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16466e0b-7a83-46aa-b39e-d52ea5c19f86" path="/var/lib/kubelet/pods/16466e0b-7a83-46aa-b39e-d52ea5c19f86/volumes" Dec 05 01:43:01 crc kubenswrapper[4759]: I1205 01:43:01.247343 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" podUID="9542f289-2a5b-4593-8cf5-d43690c6440e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.250:5353: connect: connection refused" Dec 05 01:43:01 crc kubenswrapper[4759]: I1205 01:43:01.973959 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"eac762df-54b7-43a3-bd65-a6fadb802446","Type":"ContainerStarted","Data":"55ffb08fd46b8c65cb727513b1239d611d0d1888bd51dd904489f7a4529562bf"} Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.049458 4759 generic.go:334] "Generic (PLEG): container finished" podID="4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" containerID="6f520c4e40e02037b993080740d980a9e2f7226d3295785deb93f4251767b602" exitCode=137 Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.049503 4759 generic.go:334] "Generic (PLEG): container finished" podID="4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" containerID="205bce0544bc92bfe30c54987cdc66e7c0fa7d8cd956564468042da00eded278" exitCode=137 Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.049595 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f597fc49-47d87" event={"ID":"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b","Type":"ContainerDied","Data":"6f520c4e40e02037b993080740d980a9e2f7226d3295785deb93f4251767b602"} Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.049624 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f597fc49-47d87" event={"ID":"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b","Type":"ContainerDied","Data":"205bce0544bc92bfe30c54987cdc66e7c0fa7d8cd956564468042da00eded278"} Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.093812 4759 generic.go:334] "Generic (PLEG): container finished" podID="d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" containerID="ed7d073e56a2beccb4ba930b74418c9e723c0649cc262296a634b84683583cb5" exitCode=137 Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.093846 4759 generic.go:334] "Generic (PLEG): container finished" podID="d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" containerID="077098bfcaabc3263eb49605a4bc5d96435edaee752a375d4c414f28395e4b72" exitCode=137 Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.093865 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8b95d7c69-zrnrb" event={"ID":"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f","Type":"ContainerDied","Data":"ed7d073e56a2beccb4ba930b74418c9e723c0649cc262296a634b84683583cb5"} Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.093890 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8b95d7c69-zrnrb" event={"ID":"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f","Type":"ContainerDied","Data":"077098bfcaabc3263eb49605a4bc5d96435edaee752a375d4c414f28395e4b72"} Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.135047 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.346123 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-swift-storage-0\") pod \"9542f289-2a5b-4593-8cf5-d43690c6440e\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.346203 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-sb\") pod \"9542f289-2a5b-4593-8cf5-d43690c6440e\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.346258 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-openstack-edpm-ipam\") pod \"9542f289-2a5b-4593-8cf5-d43690c6440e\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.346441 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-nb\") pod \"9542f289-2a5b-4593-8cf5-d43690c6440e\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.346466 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8fw4\" (UniqueName: \"kubernetes.io/projected/9542f289-2a5b-4593-8cf5-d43690c6440e-kube-api-access-c8fw4\") pod \"9542f289-2a5b-4593-8cf5-d43690c6440e\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.346693 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-config\") pod \"9542f289-2a5b-4593-8cf5-d43690c6440e\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.346719 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-svc\") pod \"9542f289-2a5b-4593-8cf5-d43690c6440e\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.382671 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9542f289-2a5b-4593-8cf5-d43690c6440e-kube-api-access-c8fw4" (OuterVolumeSpecName: "kube-api-access-c8fw4") pod "9542f289-2a5b-4593-8cf5-d43690c6440e" (UID: "9542f289-2a5b-4593-8cf5-d43690c6440e"). InnerVolumeSpecName "kube-api-access-c8fw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.385033 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.430037 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-759bc69744-82t58" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.450291 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-scripts\") pod \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.451653 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmzzm\" (UniqueName: \"kubernetes.io/projected/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-kube-api-access-kmzzm\") pod \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.451798 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-config-data\") pod \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.451860 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-horizon-secret-key\") pod \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.451949 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-logs\") pod \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\" (UID: \"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.452956 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8fw4\" (UniqueName: \"kubernetes.io/projected/9542f289-2a5b-4593-8cf5-d43690c6440e-kube-api-access-c8fw4\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.456523 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-logs" (OuterVolumeSpecName: "logs") pod "4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" (UID: "4a2e0ae6-c57c-4c91-a4eb-89619f25a19b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.470282 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-kube-api-access-kmzzm" (OuterVolumeSpecName: "kube-api-access-kmzzm") pod "4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" (UID: "4a2e0ae6-c57c-4c91-a4eb-89619f25a19b"). InnerVolumeSpecName "kube-api-access-kmzzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.499021 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" (UID: "4a2e0ae6-c57c-4c91-a4eb-89619f25a19b"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.530095 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6dc499994d-vp27z" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.547245 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9542f289-2a5b-4593-8cf5-d43690c6440e" (UID: "9542f289-2a5b-4593-8cf5-d43690c6440e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.553869 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9542f289-2a5b-4593-8cf5-d43690c6440e" (UID: "9542f289-2a5b-4593-8cf5-d43690c6440e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.554261 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-svc\") pod \"9542f289-2a5b-4593-8cf5-d43690c6440e\" (UID: \"9542f289-2a5b-4593-8cf5-d43690c6440e\") " Dec 05 01:43:02 crc kubenswrapper[4759]: W1205 01:43:02.554664 4759 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9542f289-2a5b-4593-8cf5-d43690c6440e/volumes/kubernetes.io~configmap/dns-svc Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.554701 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9542f289-2a5b-4593-8cf5-d43690c6440e" (UID: "9542f289-2a5b-4593-8cf5-d43690c6440e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.555231 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.555250 4759 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.555259 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-logs\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.555268 4759 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.555276 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmzzm\" (UniqueName: \"kubernetes.io/projected/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-kube-api-access-kmzzm\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.559186 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-config-data" (OuterVolumeSpecName: "config-data") pod "4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" (UID: "4a2e0ae6-c57c-4c91-a4eb-89619f25a19b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.568969 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9542f289-2a5b-4593-8cf5-d43690c6440e" (UID: "9542f289-2a5b-4593-8cf5-d43690c6440e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.576870 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "9542f289-2a5b-4593-8cf5-d43690c6440e" (UID: "9542f289-2a5b-4593-8cf5-d43690c6440e"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.584000 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.621760 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9542f289-2a5b-4593-8cf5-d43690c6440e" (UID: "9542f289-2a5b-4593-8cf5-d43690c6440e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.630602 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-scripts" (OuterVolumeSpecName: "scripts") pod "4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" (UID: "4a2e0ae6-c57c-4c91-a4eb-89619f25a19b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.658005 4759 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.658035 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.658046 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.658055 4759 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.658063 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.677578 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-759bc69744-82t58"] Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.677980 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-config" (OuterVolumeSpecName: "config") pod "9542f289-2a5b-4593-8cf5-d43690c6440e" (UID: "9542f289-2a5b-4593-8cf5-d43690c6440e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.760628 4759 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9542f289-2a5b-4593-8cf5-d43690c6440e-config\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.850031 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.861680 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4zsp\" (UniqueName: \"kubernetes.io/projected/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-kube-api-access-r4zsp\") pod \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.861757 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-config-data\") pod \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.861789 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-logs\") pod \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.861837 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-scripts\") pod \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.861919 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-horizon-secret-key\") pod \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\" (UID: \"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f\") " Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.865192 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-logs" (OuterVolumeSpecName: "logs") pod "d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" (UID: "d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.887800 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-kube-api-access-r4zsp" (OuterVolumeSpecName: "kube-api-access-r4zsp") pod "d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" (UID: "d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f"). InnerVolumeSpecName "kube-api-access-r4zsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.900542 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" (UID: "d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.952124 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-config-data" (OuterVolumeSpecName: "config-data") pod "d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" (UID: "d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.965466 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4zsp\" (UniqueName: \"kubernetes.io/projected/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-kube-api-access-r4zsp\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.965492 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.965500 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-logs\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.965509 4759 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:02 crc kubenswrapper[4759]: I1205 01:43:02.970611 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-scripts" (OuterVolumeSpecName: "scripts") pod "d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" (UID: "d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.106549 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.127044 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"eac762df-54b7-43a3-bd65-a6fadb802446","Type":"ContainerStarted","Data":"6f207283f8044454d09b38b3cf5a7d369fd5d7950606767d64aa685fc1ac3ca6"} Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.131597 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ea4288a-d1d7-4434-9f4e-940442905231","Type":"ContainerStarted","Data":"a606d3130e7bf5094744aa7e4b5939d16137111dccda78aca3e331d6addb97a6"} Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.135383 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66f597fc49-47d87" Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.135409 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f597fc49-47d87" event={"ID":"4a2e0ae6-c57c-4c91-a4eb-89619f25a19b","Type":"ContainerDied","Data":"af7c0fb355897ef7f5b62670350811776bf74fec9753245eff972a2df0c58f84"} Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.135462 4759 scope.go:117] "RemoveContainer" containerID="6f520c4e40e02037b993080740d980a9e2f7226d3295785deb93f4251767b602" Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.152237 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8b95d7c69-zrnrb" event={"ID":"d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f","Type":"ContainerDied","Data":"6408c9a0b46b1cc6985a3c0bc51e3ff086667ba7c9fb31336494824f79bc5cd0"} Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.152423 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8b95d7c69-zrnrb" Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.163180 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=5.890008518 podStartE2EDuration="15.163153629s" podCreationTimestamp="2025-12-05 01:42:48 +0000 UTC" firstStartedPulling="2025-12-05 01:42:49.917657082 +0000 UTC m=+4789.133318032" lastFinishedPulling="2025-12-05 01:42:59.190802193 +0000 UTC m=+4798.406463143" observedRunningTime="2025-12-05 01:43:03.152478701 +0000 UTC m=+4802.368139651" watchObservedRunningTime="2025-12-05 01:43:03.163153629 +0000 UTC m=+4802.378814579" Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.174949 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1acbfd13-8d88-4169-b1f6-098a33b9cc15","Type":"ContainerStarted","Data":"5c17211873278b73c113af569b1f59859dbe5ddc6d6eac931831845159f352d6"} Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.175947 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-759bc69744-82t58" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon-log" containerID="cri-o://4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57" gracePeriod=30 Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.176277 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.176358 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd9576ff-cgflc" event={"ID":"9542f289-2a5b-4593-8cf5-d43690c6440e","Type":"ContainerDied","Data":"763d8c1878c0f4c01c451ce9b002dd29b518ee827bc4274fb1b42df31e1b833d"} Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.176410 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-759bc69744-82t58" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon" containerID="cri-o://290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb" gracePeriod=30 Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.214535 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66f597fc49-47d87"] Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.232952 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-66f597fc49-47d87"] Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.290663 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dd9576ff-cgflc"] Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.321035 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dd9576ff-cgflc"] Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.338474 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8b95d7c69-zrnrb"] Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.352981 4759 scope.go:117] "RemoveContainer" containerID="205bce0544bc92bfe30c54987cdc66e7c0fa7d8cd956564468042da00eded278" Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.358561 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8b95d7c69-zrnrb"] Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.380470 4759 scope.go:117] "RemoveContainer" containerID="ed7d073e56a2beccb4ba930b74418c9e723c0649cc262296a634b84683583cb5" Dec 05 01:43:03 crc kubenswrapper[4759]: 
I1205 01:43:03.576196 4759 scope.go:117] "RemoveContainer" containerID="077098bfcaabc3263eb49605a4bc5d96435edaee752a375d4c414f28395e4b72" Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.621501 4759 scope.go:117] "RemoveContainer" containerID="26067cd9ec606d5cee224ef399c9cf32d94ed7ef69e2eee1f341554299e7944f" Dec 05 01:43:03 crc kubenswrapper[4759]: I1205 01:43:03.647152 4759 scope.go:117] "RemoveContainer" containerID="6d9b7cb37dcdd232a7bcf4c16615757e738bb0dddf7f7faf35be90aeaea8e838" Dec 05 01:43:04 crc kubenswrapper[4759]: I1205 01:43:04.229294 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1acbfd13-8d88-4169-b1f6-098a33b9cc15","Type":"ContainerStarted","Data":"d01b191c78c31c8f9779c27324b8db5a61615bb67a6b404e04fd3de0e74c2718"} Dec 05 01:43:04 crc kubenswrapper[4759]: I1205 01:43:04.230430 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 05 01:43:04 crc kubenswrapper[4759]: I1205 01:43:04.248300 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ea4288a-d1d7-4434-9f4e-940442905231","Type":"ContainerStarted","Data":"4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334"} Dec 05 01:43:04 crc kubenswrapper[4759]: I1205 01:43:04.276071 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=9.276048738 podStartE2EDuration="9.276048738s" podCreationTimestamp="2025-12-05 01:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:43:04.257389507 +0000 UTC m=+4803.473050457" watchObservedRunningTime="2025-12-05 01:43:04.276048738 +0000 UTC m=+4803.491709688" Dec 05 01:43:05 crc kubenswrapper[4759]: I1205 01:43:05.169611 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" path="/var/lib/kubelet/pods/4a2e0ae6-c57c-4c91-a4eb-89619f25a19b/volumes" Dec 05 01:43:05 crc kubenswrapper[4759]: I1205 01:43:05.170798 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9542f289-2a5b-4593-8cf5-d43690c6440e" path="/var/lib/kubelet/pods/9542f289-2a5b-4593-8cf5-d43690c6440e/volumes" Dec 05 01:43:05 crc kubenswrapper[4759]: I1205 01:43:05.171784 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" path="/var/lib/kubelet/pods/d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f/volumes" Dec 05 01:43:05 crc kubenswrapper[4759]: I1205 01:43:05.808884 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:43:06 crc kubenswrapper[4759]: I1205 01:43:06.274384 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ea4288a-d1d7-4434-9f4e-940442905231","Type":"ContainerStarted","Data":"aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc"} Dec 05 01:43:06 crc kubenswrapper[4759]: I1205 01:43:06.274429 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ea4288a-d1d7-4434-9f4e-940442905231","Type":"ContainerStarted","Data":"5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69"} Dec 05 01:43:07 crc kubenswrapper[4759]: I1205 01:43:07.139460 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-759bc69744-82t58" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.1.65:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.65:8443: connect: connection refused" Dec 05 01:43:07 crc kubenswrapper[4759]: I1205 01:43:07.295627 4759 generic.go:334] "Generic (PLEG): container finished" podID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerID="290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb" exitCode=0 Dec 05 01:43:07 crc kubenswrapper[4759]: I1205 01:43:07.295667 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-759bc69744-82t58" event={"ID":"9978bcbc-9b12-401b-b73e-7aeb17587928","Type":"ContainerDied","Data":"290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb"} Dec 05 01:43:08 crc kubenswrapper[4759]: I1205 01:43:08.360686 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ea4288a-d1d7-4434-9f4e-940442905231","Type":"ContainerStarted","Data":"c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468"} Dec 05 01:43:08 crc kubenswrapper[4759]: I1205 01:43:08.360923 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="ceilometer-central-agent" containerID="cri-o://4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334" gracePeriod=30 Dec 05 01:43:08 crc kubenswrapper[4759]: I1205 01:43:08.361046 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="proxy-httpd" containerID="cri-o://c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468" gracePeriod=30 Dec 05 01:43:08 crc kubenswrapper[4759]: I1205 01:43:08.361065 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="sg-core" containerID="cri-o://aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc" gracePeriod=30 Dec 05 01:43:08 crc kubenswrapper[4759]: I1205 01:43:08.361084 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="ceilometer-notification-agent" containerID="cri-o://5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69" gracePeriod=30 Dec 05 01:43:08 crc kubenswrapper[4759]: I1205 01:43:08.361427 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 01:43:08 crc kubenswrapper[4759]: I1205 01:43:08.419465 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.456454645 podStartE2EDuration="9.4194477s" podCreationTimestamp="2025-12-05 01:42:59 +0000 UTC" firstStartedPulling="2025-12-05 01:43:02.618494448 +0000 UTC m=+4801.834155398" lastFinishedPulling="2025-12-05 01:43:07.581487503 +0000 UTC m=+4806.797148453" observedRunningTime="2025-12-05 01:43:08.413731513 +0000 UTC m=+4807.629392473" watchObservedRunningTime="2025-12-05 01:43:08.4194477 +0000 UTC m=+4807.635108640" Dec 05 01:43:09 crc kubenswrapper[4759]: I1205 01:43:09.084240 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 05 01:43:09 crc kubenswrapper[4759]: I1205 01:43:09.374242 4759 generic.go:334] "Generic (PLEG): container finished" podID="5ea4288a-d1d7-4434-9f4e-940442905231" 
containerID="c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468" exitCode=0 Dec 05 01:43:09 crc kubenswrapper[4759]: I1205 01:43:09.374288 4759 generic.go:334] "Generic (PLEG): container finished" podID="5ea4288a-d1d7-4434-9f4e-940442905231" containerID="aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc" exitCode=2 Dec 05 01:43:09 crc kubenswrapper[4759]: I1205 01:43:09.374300 4759 generic.go:334] "Generic (PLEG): container finished" podID="5ea4288a-d1d7-4434-9f4e-940442905231" containerID="5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69" exitCode=0 Dec 05 01:43:09 crc kubenswrapper[4759]: I1205 01:43:09.374344 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ea4288a-d1d7-4434-9f4e-940442905231","Type":"ContainerDied","Data":"c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468"} Dec 05 01:43:09 crc kubenswrapper[4759]: I1205 01:43:09.374407 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ea4288a-d1d7-4434-9f4e-940442905231","Type":"ContainerDied","Data":"aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc"} Dec 05 01:43:09 crc kubenswrapper[4759]: I1205 01:43:09.374426 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ea4288a-d1d7-4434-9f4e-940442905231","Type":"ContainerDied","Data":"5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69"} Dec 05 01:43:11 crc kubenswrapper[4759]: I1205 01:43:11.137618 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 05 01:43:11 crc kubenswrapper[4759]: I1205 01:43:11.210389 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 05 01:43:11 crc kubenswrapper[4759]: I1205 01:43:11.403266 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="c618cb90-2d43-4445-9955-e7de3bf03e41" containerName="manila-scheduler" containerID="cri-o://9fbd8745049c803f7cc194f3ddf198bfd5b2f8ea4d0a2bf3ee1a32bedee28bcf" gracePeriod=30 Dec 05 01:43:11 crc kubenswrapper[4759]: I1205 01:43:11.403332 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="c618cb90-2d43-4445-9955-e7de3bf03e41" containerName="probe" containerID="cri-o://27556903511d415ef02ef48e40e143d700e345be456472317f6eb7c6b608210e" gracePeriod=30 Dec 05 01:43:12 crc kubenswrapper[4759]: I1205 01:43:12.420448 4759 generic.go:334] "Generic (PLEG): container finished" podID="c618cb90-2d43-4445-9955-e7de3bf03e41" containerID="27556903511d415ef02ef48e40e143d700e345be456472317f6eb7c6b608210e" exitCode=0 Dec 05 01:43:12 crc kubenswrapper[4759]: I1205 01:43:12.420546 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"c618cb90-2d43-4445-9955-e7de3bf03e41","Type":"ContainerDied","Data":"27556903511d415ef02ef48e40e143d700e345be456472317f6eb7c6b608210e"} Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.056966 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.136424 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-config-data\") pod \"5ea4288a-d1d7-4434-9f4e-940442905231\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.136468 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-combined-ca-bundle\") pod \"5ea4288a-d1d7-4434-9f4e-940442905231\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.136494 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-ceilometer-tls-certs\") pod \"5ea4288a-d1d7-4434-9f4e-940442905231\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.136519 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-sg-core-conf-yaml\") pod \"5ea4288a-d1d7-4434-9f4e-940442905231\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.136580 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-run-httpd\") pod \"5ea4288a-d1d7-4434-9f4e-940442905231\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.136603 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-log-httpd\") pod \"5ea4288a-d1d7-4434-9f4e-940442905231\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.136632 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcjhj\" (UniqueName: \"kubernetes.io/projected/5ea4288a-d1d7-4434-9f4e-940442905231-kube-api-access-jcjhj\") pod \"5ea4288a-d1d7-4434-9f4e-940442905231\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.136711 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-scripts\") pod \"5ea4288a-d1d7-4434-9f4e-940442905231\" (UID: \"5ea4288a-d1d7-4434-9f4e-940442905231\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.138050 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5ea4288a-d1d7-4434-9f4e-940442905231" (UID: "5ea4288a-d1d7-4434-9f4e-940442905231"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.138504 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5ea4288a-d1d7-4434-9f4e-940442905231" (UID: "5ea4288a-d1d7-4434-9f4e-940442905231"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.143853 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-scripts" (OuterVolumeSpecName: "scripts") pod "5ea4288a-d1d7-4434-9f4e-940442905231" (UID: "5ea4288a-d1d7-4434-9f4e-940442905231"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.144650 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ea4288a-d1d7-4434-9f4e-940442905231-kube-api-access-jcjhj" (OuterVolumeSpecName: "kube-api-access-jcjhj") pod "5ea4288a-d1d7-4434-9f4e-940442905231" (UID: "5ea4288a-d1d7-4434-9f4e-940442905231"). InnerVolumeSpecName "kube-api-access-jcjhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.195833 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5ea4288a-d1d7-4434-9f4e-940442905231" (UID: "5ea4288a-d1d7-4434-9f4e-940442905231"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.226383 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5ea4288a-d1d7-4434-9f4e-940442905231" (UID: "5ea4288a-d1d7-4434-9f4e-940442905231"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.241327 4759 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.241358 4759 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ea4288a-d1d7-4434-9f4e-940442905231-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.241367 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcjhj\" (UniqueName: \"kubernetes.io/projected/5ea4288a-d1d7-4434-9f4e-940442905231-kube-api-access-jcjhj\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.241378 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.241386 4759 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.241395 4759 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.259750 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ea4288a-d1d7-4434-9f4e-940442905231" (UID: "5ea4288a-d1d7-4434-9f4e-940442905231"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.305448 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-config-data" (OuterVolumeSpecName: "config-data") pod "5ea4288a-d1d7-4434-9f4e-940442905231" (UID: "5ea4288a-d1d7-4434-9f4e-940442905231"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.342855 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.342886 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea4288a-d1d7-4434-9f4e-940442905231-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.447544 4759 generic.go:334] "Generic (PLEG): container finished" podID="c618cb90-2d43-4445-9955-e7de3bf03e41" containerID="9fbd8745049c803f7cc194f3ddf198bfd5b2f8ea4d0a2bf3ee1a32bedee28bcf" exitCode=0 Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.447614 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"c618cb90-2d43-4445-9955-e7de3bf03e41","Type":"ContainerDied","Data":"9fbd8745049c803f7cc194f3ddf198bfd5b2f8ea4d0a2bf3ee1a32bedee28bcf"} Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.451206 4759 generic.go:334] "Generic (PLEG): container finished" podID="5ea4288a-d1d7-4434-9f4e-940442905231" containerID="4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334" exitCode=0 Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.451250 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ea4288a-d1d7-4434-9f4e-940442905231","Type":"ContainerDied","Data":"4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334"} Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.451284 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ea4288a-d1d7-4434-9f4e-940442905231","Type":"ContainerDied","Data":"a606d3130e7bf5094744aa7e4b5939d16137111dccda78aca3e331d6addb97a6"} Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.451350 4759 scope.go:117] "RemoveContainer" containerID="c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.451554 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.483890 4759 scope.go:117] "RemoveContainer" containerID="aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.503731 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.517610 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.532461 4759 scope.go:117] "RemoveContainer" containerID="5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.558440 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.558929 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" containerName="horizon-log" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.558952 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" containerName="horizon-log" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.558974 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" containerName="horizon" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.558981 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" containerName="horizon" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.558988 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="ceilometer-central-agent" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.558994 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="ceilometer-central-agent" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.559014 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="proxy-httpd" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559020 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="proxy-httpd" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.559029 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9542f289-2a5b-4593-8cf5-d43690c6440e" containerName="init" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559035 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9542f289-2a5b-4593-8cf5-d43690c6440e" containerName="init" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.559052 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9542f289-2a5b-4593-8cf5-d43690c6440e" containerName="dnsmasq-dns" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559058 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9542f289-2a5b-4593-8cf5-d43690c6440e" containerName="dnsmasq-dns" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.559065 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="sg-core" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559070 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" 
containerName="sg-core" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.559080 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="ceilometer-notification-agent" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559088 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="ceilometer-notification-agent" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.559103 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" containerName="horizon" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559109 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" containerName="horizon" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.559125 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" containerName="horizon-log" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559131 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" containerName="horizon-log" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559358 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="ceilometer-central-agent" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559379 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" containerName="horizon-log" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559389 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9542f289-2a5b-4593-8cf5-d43690c6440e" containerName="dnsmasq-dns" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559405 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="proxy-httpd" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559411 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" containerName="horizon-log" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559421 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="ceilometer-notification-agent" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559431 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="d059da11-8f4c-4bfe-a9b6-1dcc5bd93a9f" containerName="horizon" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559440 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" containerName="sg-core" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.559454 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2e0ae6-c57c-4c91-a4eb-89619f25a19b" containerName="horizon" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.561395 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.566063 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.566179 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.566293 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.572324 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.583531 4759 scope.go:117] "RemoveContainer" containerID="4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.614426 4759 scope.go:117] "RemoveContainer" containerID="c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.615210 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468\": container with ID starting with c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468 not found: ID does not exist" containerID="c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.615249 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468"} err="failed to get container status \"c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468\": rpc error: code = NotFound desc = could not find container \"c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468\": container with ID starting with c0c481cec510cf699b08b571c968afd86644c9a5b7690fc7b7cb1810fcb50468 not found: ID does not exist" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.615273 4759 scope.go:117] "RemoveContainer" containerID="aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.616170 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc\": container with ID starting with aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc not found: ID does not exist" containerID="aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.616221 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc"} err="failed to get container status \"aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc\": rpc error: code = NotFound desc = could not find container \"aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc\": container with ID starting with aa5bedf53cdac0f74543274d303d68b57d8437f349091833aad494b922d89bcc not found: ID does not exist" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.616253 4759 scope.go:117] "RemoveContainer" containerID="5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69" Dec 05 01:43:13 
crc kubenswrapper[4759]: E1205 01:43:13.616595 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69\": container with ID starting with 5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69 not found: ID does not exist" containerID="5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.616627 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69"} err="failed to get container status \"5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69\": rpc error: code = NotFound desc = could not find container \"5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69\": container with ID starting with 5874e945b20dcdda8a94afae847befe177f7126c7157e1fa31158a0bae6cff69 not found: ID does not exist" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.616646 4759 scope.go:117] "RemoveContainer" containerID="4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334" Dec 05 01:43:13 crc kubenswrapper[4759]: E1205 01:43:13.616894 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334\": container with ID starting with 4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334 not found: ID does not exist" containerID="4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.616918 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334"} err="failed to get container status \"4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334\": rpc error: code = NotFound desc = could not find container \"4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334\": container with ID starting with 4b6fa2803c37178eea63508d031689eb825656a467463e903604276dfdb0e334 not found: ID does not exist" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.647883 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-config-data\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.647939 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.647963 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-run-httpd\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.648087 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.648184 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-log-httpd\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.648272 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6jdt\" (UniqueName: \"kubernetes.io/projected/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-kube-api-access-n6jdt\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.648327 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.648418 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-scripts\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.677866 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.750336 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data\") pod \"c618cb90-2d43-4445-9955-e7de3bf03e41\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.750499 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data-custom\") pod \"c618cb90-2d43-4445-9955-e7de3bf03e41\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.750531 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c618cb90-2d43-4445-9955-e7de3bf03e41-etc-machine-id\") pod \"c618cb90-2d43-4445-9955-e7de3bf03e41\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.750558 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-scripts\") pod \"c618cb90-2d43-4445-9955-e7de3bf03e41\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.750696 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-combined-ca-bundle\") pod \"c618cb90-2d43-4445-9955-e7de3bf03e41\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.750767 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vvw8\" (UniqueName: \"kubernetes.io/projected/c618cb90-2d43-4445-9955-e7de3bf03e41-kube-api-access-8vvw8\") pod \"c618cb90-2d43-4445-9955-e7de3bf03e41\" (UID: \"c618cb90-2d43-4445-9955-e7de3bf03e41\") " Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.751019 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c618cb90-2d43-4445-9955-e7de3bf03e41-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c618cb90-2d43-4445-9955-e7de3bf03e41" (UID: "c618cb90-2d43-4445-9955-e7de3bf03e41"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.751295 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6jdt\" (UniqueName: \"kubernetes.io/projected/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-kube-api-access-n6jdt\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.751363 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.751485 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-scripts\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.751534 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-config-data\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.751575 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.751597 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-run-httpd\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.751717 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.751772 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-log-httpd\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.751843 4759 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c618cb90-2d43-4445-9955-e7de3bf03e41-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.752272 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-log-httpd\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 
01:43:13.753828 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-run-httpd\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.756791 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-scripts" (OuterVolumeSpecName: "scripts") pod "c618cb90-2d43-4445-9955-e7de3bf03e41" (UID: "c618cb90-2d43-4445-9955-e7de3bf03e41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.757745 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c618cb90-2d43-4445-9955-e7de3bf03e41" (UID: "c618cb90-2d43-4445-9955-e7de3bf03e41"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.758539 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c618cb90-2d43-4445-9955-e7de3bf03e41-kube-api-access-8vvw8" (OuterVolumeSpecName: "kube-api-access-8vvw8") pod "c618cb90-2d43-4445-9955-e7de3bf03e41" (UID: "c618cb90-2d43-4445-9955-e7de3bf03e41"). InnerVolumeSpecName "kube-api-access-8vvw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.759837 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.759915 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.760499 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-scripts\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.761598 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.763340 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-config-data\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.771526 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6jdt\" (UniqueName: 
\"kubernetes.io/projected/507bfd65-c768-4dfb-9e1c-aed7bdf0ef55-kube-api-access-n6jdt\") pod \"ceilometer-0\" (UID: \"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55\") " pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.830611 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c618cb90-2d43-4445-9955-e7de3bf03e41" (UID: "c618cb90-2d43-4445-9955-e7de3bf03e41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.854279 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.854335 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vvw8\" (UniqueName: \"kubernetes.io/projected/c618cb90-2d43-4445-9955-e7de3bf03e41-kube-api-access-8vvw8\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.854351 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.854363 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.892254 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data" (OuterVolumeSpecName: "config-data") pod "c618cb90-2d43-4445-9955-e7de3bf03e41" (UID: "c618cb90-2d43-4445-9955-e7de3bf03e41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.892573 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 01:43:13 crc kubenswrapper[4759]: I1205 01:43:13.956258 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c618cb90-2d43-4445-9955-e7de3bf03e41-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:14 crc kubenswrapper[4759]: W1205 01:43:14.412543 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod507bfd65_c768_4dfb_9e1c_aed7bdf0ef55.slice/crio-ca7edd5c26d46c6a5ce8c66a946ffbd6360e416e8726536e95d8265774eeafc6 WatchSource:0}: Error finding container ca7edd5c26d46c6a5ce8c66a946ffbd6360e416e8726536e95d8265774eeafc6: Status 404 returned error can't find the container with id ca7edd5c26d46c6a5ce8c66a946ffbd6360e416e8726536e95d8265774eeafc6 Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.420850 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.462493 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55","Type":"ContainerStarted","Data":"ca7edd5c26d46c6a5ce8c66a946ffbd6360e416e8726536e95d8265774eeafc6"} Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.473786 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"c618cb90-2d43-4445-9955-e7de3bf03e41","Type":"ContainerDied","Data":"dcb604eb0593c58dc324a30b3ded7bd930ace991217a61bdca6aafc75a578098"} Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.473850 4759 scope.go:117] "RemoveContainer" containerID="27556903511d415ef02ef48e40e143d700e345be456472317f6eb7c6b608210e" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.474010 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.512060 4759 scope.go:117] "RemoveContainer" containerID="9fbd8745049c803f7cc194f3ddf198bfd5b2f8ea4d0a2bf3ee1a32bedee28bcf" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.530836 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.546961 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.559695 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 05 01:43:14 crc kubenswrapper[4759]: E1205 01:43:14.560374 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c618cb90-2d43-4445-9955-e7de3bf03e41" containerName="probe" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.560399 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c618cb90-2d43-4445-9955-e7de3bf03e41" containerName="probe" Dec 05 01:43:14 crc kubenswrapper[4759]: E1205 01:43:14.560430 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c618cb90-2d43-4445-9955-e7de3bf03e41" containerName="manila-scheduler" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.560439 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c618cb90-2d43-4445-9955-e7de3bf03e41" containerName="manila-scheduler" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.560756 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c618cb90-2d43-4445-9955-e7de3bf03e41" containerName="probe" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.560802 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c618cb90-2d43-4445-9955-e7de3bf03e41" containerName="manila-scheduler" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.562406 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.565602 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.569884 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.578813 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.578883 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h744n\" (UniqueName: \"kubernetes.io/projected/20807b16-d503-447f-84ca-43f49c001c0c-kube-api-access-h744n\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.578923 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20807b16-d503-447f-84ca-43f49c001c0c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.578949 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.578978 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-config-data\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.579038 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-scripts\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.680670 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.680721 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h744n\" (UniqueName: \"kubernetes.io/projected/20807b16-d503-447f-84ca-43f49c001c0c-kube-api-access-h744n\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.680749 4759 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20807b16-d503-447f-84ca-43f49c001c0c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.680767 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.680786 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-config-data\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.680823 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-scripts\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.681885 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20807b16-d503-447f-84ca-43f49c001c0c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.685522 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.686872 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-scripts\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.687145 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.687468 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20807b16-d503-447f-84ca-43f49c001c0c-config-data\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.699897 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h744n\" (UniqueName: \"kubernetes.io/projected/20807b16-d503-447f-84ca-43f49c001c0c-kube-api-access-h744n\") pod \"manila-scheduler-0\" (UID: \"20807b16-d503-447f-84ca-43f49c001c0c\") " pod="openstack/manila-scheduler-0" Dec 
05 01:43:14 crc kubenswrapper[4759]: I1205 01:43:14.882169 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 05 01:43:15 crc kubenswrapper[4759]: I1205 01:43:15.170563 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ea4288a-d1d7-4434-9f4e-940442905231" path="/var/lib/kubelet/pods/5ea4288a-d1d7-4434-9f4e-940442905231/volumes" Dec 05 01:43:15 crc kubenswrapper[4759]: I1205 01:43:15.171777 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c618cb90-2d43-4445-9955-e7de3bf03e41" path="/var/lib/kubelet/pods/c618cb90-2d43-4445-9955-e7de3bf03e41/volumes" Dec 05 01:43:15 crc kubenswrapper[4759]: I1205 01:43:15.407382 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 05 01:43:15 crc kubenswrapper[4759]: W1205 01:43:15.412220 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20807b16_d503_447f_84ca_43f49c001c0c.slice/crio-485048c575124d5dd9da081a6ef4b4e5121965a4885f5847b3becf9972f38b16 WatchSource:0}: Error finding container 485048c575124d5dd9da081a6ef4b4e5121965a4885f5847b3becf9972f38b16: Status 404 returned error can't find the container with id 485048c575124d5dd9da081a6ef4b4e5121965a4885f5847b3becf9972f38b16 Dec 05 01:43:15 crc kubenswrapper[4759]: I1205 01:43:15.511528 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55","Type":"ContainerStarted","Data":"fb61107f5aa236f83bed5f1063b86a446599a30e96a60b9bdd3b45912e3ad06b"} Dec 05 01:43:15 crc kubenswrapper[4759]: I1205 01:43:15.569339 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"20807b16-d503-447f-84ca-43f49c001c0c","Type":"ContainerStarted","Data":"485048c575124d5dd9da081a6ef4b4e5121965a4885f5847b3becf9972f38b16"} Dec 05 01:43:17 crc kubenswrapper[4759]: I1205 01:43:17.139840 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-759bc69744-82t58" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.65:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.65:8443: connect: connection refused" Dec 05 01:43:17 crc kubenswrapper[4759]: I1205 01:43:17.606972 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"20807b16-d503-447f-84ca-43f49c001c0c","Type":"ContainerStarted","Data":"17bb5c0e9a4ff9cc67297ce07160fef7c89fb9c0445aec8cc1962ac3a88c0780"} Dec 05 01:43:17 crc kubenswrapper[4759]: I1205 01:43:17.607210 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"20807b16-d503-447f-84ca-43f49c001c0c","Type":"ContainerStarted","Data":"459fcbde4d98706beea9ed94017baff1a92ec2067d1b70093d878dd180fd420a"} Dec 05 01:43:17 crc kubenswrapper[4759]: I1205 01:43:17.611103 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55","Type":"ContainerStarted","Data":"e686533a9dfeefe8d910c3a6f9565f28c060e60ba1ad4121ebdc1ab758975b5d"} Dec 05 01:43:17 crc kubenswrapper[4759]: I1205 01:43:17.632995 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.632976334 podStartE2EDuration="3.632976334s" podCreationTimestamp="2025-12-05 01:43:14 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:43:17.627593474 +0000 UTC m=+4816.843254424" watchObservedRunningTime="2025-12-05 01:43:17.632976334 +0000 UTC m=+4816.848637284" Dec 05 01:43:17 crc kubenswrapper[4759]: I1205 01:43:17.895620 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 05 01:43:19 crc kubenswrapper[4759]: I1205 01:43:19.635751 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55","Type":"ContainerStarted","Data":"8afcd6232ecac84491d51ee268bf293dafee409be7df360731f218a94e93dda2"} Dec 05 01:43:20 crc kubenswrapper[4759]: I1205 01:43:20.601643 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 05 01:43:20 crc kubenswrapper[4759]: I1205 01:43:20.651534 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"507bfd65-c768-4dfb-9e1c-aed7bdf0ef55","Type":"ContainerStarted","Data":"feb019f975b9f09d602a4b3606a94fab2c775003c4c27c5b0df51cf6e73faebd"} Dec 05 01:43:20 crc kubenswrapper[4759]: I1205 01:43:20.651792 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 01:43:20 crc kubenswrapper[4759]: I1205 01:43:20.693294 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 05 01:43:20 crc kubenswrapper[4759]: I1205 01:43:20.693582 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="eac762df-54b7-43a3-bd65-a6fadb802446" containerName="manila-share" containerID="cri-o://55ffb08fd46b8c65cb727513b1239d611d0d1888bd51dd904489f7a4529562bf" gracePeriod=30 Dec 05 01:43:20 crc kubenswrapper[4759]: I1205 01:43:20.694130 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="eac762df-54b7-43a3-bd65-a6fadb802446" containerName="probe" containerID="cri-o://6f207283f8044454d09b38b3cf5a7d369fd5d7950606767d64aa685fc1ac3ca6" gracePeriod=30 Dec 05 01:43:20 crc kubenswrapper[4759]: I1205 01:43:20.694162 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.023739045 podStartE2EDuration="7.694144628s" podCreationTimestamp="2025-12-05 01:43:13 +0000 UTC" firstStartedPulling="2025-12-05 01:43:14.416636608 +0000 UTC m=+4813.632297558" lastFinishedPulling="2025-12-05 01:43:20.087042191 +0000 UTC m=+4819.302703141" observedRunningTime="2025-12-05 01:43:20.678112311 +0000 UTC m=+4819.893773271" watchObservedRunningTime="2025-12-05 01:43:20.694144628 +0000 UTC m=+4819.909805578" Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.670124 4759 generic.go:334] "Generic (PLEG): container finished" podID="eac762df-54b7-43a3-bd65-a6fadb802446" containerID="6f207283f8044454d09b38b3cf5a7d369fd5d7950606767d64aa685fc1ac3ca6" exitCode=0 Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.670551 4759 generic.go:334] "Generic (PLEG): container finished" podID="eac762df-54b7-43a3-bd65-a6fadb802446" containerID="55ffb08fd46b8c65cb727513b1239d611d0d1888bd51dd904489f7a4529562bf" exitCode=1 Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.670367 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"eac762df-54b7-43a3-bd65-a6fadb802446","Type":"ContainerDied","Data":"6f207283f8044454d09b38b3cf5a7d369fd5d7950606767d64aa685fc1ac3ca6"} Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.672353 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"eac762df-54b7-43a3-bd65-a6fadb802446","Type":"ContainerDied","Data":"55ffb08fd46b8c65cb727513b1239d611d0d1888bd51dd904489f7a4529562bf"} Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.855661 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.900916 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-var-lib-manila\") pod \"eac762df-54b7-43a3-bd65-a6fadb802446\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.901380 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data\") pod \"eac762df-54b7-43a3-bd65-a6fadb802446\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.901493 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55q64\" (UniqueName: \"kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-kube-api-access-55q64\") pod \"eac762df-54b7-43a3-bd65-a6fadb802446\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.901527 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data-custom\") pod \"eac762df-54b7-43a3-bd65-a6fadb802446\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.901597 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-ceph\") pod \"eac762df-54b7-43a3-bd65-a6fadb802446\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.901659 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-etc-machine-id\") pod \"eac762df-54b7-43a3-bd65-a6fadb802446\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.901709 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-combined-ca-bundle\") pod \"eac762df-54b7-43a3-bd65-a6fadb802446\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.901863 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-scripts\") pod \"eac762df-54b7-43a3-bd65-a6fadb802446\" (UID: \"eac762df-54b7-43a3-bd65-a6fadb802446\") " Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.901064 4759 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "eac762df-54b7-43a3-bd65-a6fadb802446" (UID: "eac762df-54b7-43a3-bd65-a6fadb802446"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.901772 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "eac762df-54b7-43a3-bd65-a6fadb802446" (UID: "eac762df-54b7-43a3-bd65-a6fadb802446"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.903226 4759 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-var-lib-manila\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.903250 4759 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eac762df-54b7-43a3-bd65-a6fadb802446-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.908543 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-ceph" (OuterVolumeSpecName: "ceph") pod "eac762df-54b7-43a3-bd65-a6fadb802446" (UID: "eac762df-54b7-43a3-bd65-a6fadb802446"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.911120 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eac762df-54b7-43a3-bd65-a6fadb802446" (UID: "eac762df-54b7-43a3-bd65-a6fadb802446"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.914655 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-kube-api-access-55q64" (OuterVolumeSpecName: "kube-api-access-55q64") pod "eac762df-54b7-43a3-bd65-a6fadb802446" (UID: "eac762df-54b7-43a3-bd65-a6fadb802446"). InnerVolumeSpecName "kube-api-access-55q64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.922940 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-scripts" (OuterVolumeSpecName: "scripts") pod "eac762df-54b7-43a3-bd65-a6fadb802446" (UID: "eac762df-54b7-43a3-bd65-a6fadb802446"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:21 crc kubenswrapper[4759]: I1205 01:43:21.985227 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eac762df-54b7-43a3-bd65-a6fadb802446" (UID: "eac762df-54b7-43a3-bd65-a6fadb802446"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.005718 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55q64\" (UniqueName: \"kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-kube-api-access-55q64\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.005748 4759 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.005759 4759 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eac762df-54b7-43a3-bd65-a6fadb802446-ceph\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.005768 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.005780 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.037953 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data" (OuterVolumeSpecName: "config-data") pod "eac762df-54b7-43a3-bd65-a6fadb802446" (UID: "eac762df-54b7-43a3-bd65-a6fadb802446"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.108751 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac762df-54b7-43a3-bd65-a6fadb802446-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.684240 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"eac762df-54b7-43a3-bd65-a6fadb802446","Type":"ContainerDied","Data":"3b6d0da6b6587c9c03409a5ecdd26c166cb716e9317c8931754812dd03b568d1"} Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.684298 4759 scope.go:117] "RemoveContainer" containerID="6f207283f8044454d09b38b3cf5a7d369fd5d7950606767d64aa685fc1ac3ca6" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.684393 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.727278 4759 scope.go:117] "RemoveContainer" containerID="55ffb08fd46b8c65cb727513b1239d611d0d1888bd51dd904489f7a4529562bf" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.752025 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.774571 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.793952 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 05 01:43:22 crc kubenswrapper[4759]: E1205 01:43:22.794680 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac762df-54b7-43a3-bd65-a6fadb802446" containerName="manila-share" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.794713 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac762df-54b7-43a3-bd65-a6fadb802446" containerName="manila-share" Dec 05 01:43:22 crc kubenswrapper[4759]: E1205 01:43:22.794791 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac762df-54b7-43a3-bd65-a6fadb802446" containerName="probe" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.794805 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac762df-54b7-43a3-bd65-a6fadb802446" containerName="probe" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.795205 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac762df-54b7-43a3-bd65-a6fadb802446" containerName="manila-share" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.795253 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac762df-54b7-43a3-bd65-a6fadb802446" containerName="probe" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.797472 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.800773 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.812298 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.828239 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-scripts\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.828480 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.828525 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-config-data\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.828619 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-ceph\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.828655 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldtm4\" (UniqueName: \"kubernetes.io/projected/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-kube-api-access-ldtm4\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.828691 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.828823 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.828865 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc 
kubenswrapper[4759]: I1205 01:43:22.931430 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.931488 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.931586 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-scripts\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.931685 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.931708 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-config-data\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.931764 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-ceph\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.931789 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldtm4\" (UniqueName: \"kubernetes.io/projected/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-kube-api-access-ldtm4\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.931810 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.932192 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.932258 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.935295 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.936421 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.937011 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-config-data\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.938030 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-ceph\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.940693 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-scripts\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:22 crc kubenswrapper[4759]: I1205 01:43:22.957073 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldtm4\" (UniqueName: \"kubernetes.io/projected/a8b83f4d-0c22-48a4-b589-109ff6a5e8e2-kube-api-access-ldtm4\") pod \"manila-share-share1-0\" (UID: \"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2\") " pod="openstack/manila-share-share1-0" Dec 05 01:43:23 crc kubenswrapper[4759]: I1205 01:43:23.150072 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 05 01:43:23 crc kubenswrapper[4759]: I1205 01:43:23.177806 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eac762df-54b7-43a3-bd65-a6fadb802446" path="/var/lib/kubelet/pods/eac762df-54b7-43a3-bd65-a6fadb802446/volumes" Dec 05 01:43:23 crc kubenswrapper[4759]: I1205 01:43:23.841138 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 05 01:43:23 crc kubenswrapper[4759]: W1205 01:43:23.843157 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b83f4d_0c22_48a4_b589_109ff6a5e8e2.slice/crio-a7f721ed02a5cab11b61db1e53720eb0bf19b4410f91890bf9dbfc3a6316fdf2 WatchSource:0}: Error finding container a7f721ed02a5cab11b61db1e53720eb0bf19b4410f91890bf9dbfc3a6316fdf2: Status 404 returned error can't find the container with id a7f721ed02a5cab11b61db1e53720eb0bf19b4410f91890bf9dbfc3a6316fdf2 Dec 05 01:43:24 crc kubenswrapper[4759]: I1205 01:43:24.708526 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2","Type":"ContainerStarted","Data":"6a01124d88d498847de8793438780cd22e1d4ebcdd883c5e35af0f301e70baed"} Dec 05 01:43:24 crc kubenswrapper[4759]: I1205 01:43:24.709126 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2","Type":"ContainerStarted","Data":"a7f721ed02a5cab11b61db1e53720eb0bf19b4410f91890bf9dbfc3a6316fdf2"} Dec 05 01:43:24 crc kubenswrapper[4759]: I1205 01:43:24.882361 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 05 01:43:25 crc kubenswrapper[4759]: I1205 01:43:25.724362 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a8b83f4d-0c22-48a4-b589-109ff6a5e8e2","Type":"ContainerStarted","Data":"35a796d8d21c3d6305e9f86b078f6f830e5934505e33dd89fd44e48737e5c4cd"} Dec 05 01:43:25 crc kubenswrapper[4759]: I1205 01:43:25.755398 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.755370573 podStartE2EDuration="3.755370573s" podCreationTimestamp="2025-12-05 01:43:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 01:43:25.746449658 +0000 UTC m=+4824.962110618" watchObservedRunningTime="2025-12-05 01:43:25.755370573 +0000 UTC m=+4824.971031563" Dec 05 01:43:27 crc kubenswrapper[4759]: I1205 01:43:27.147075 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-759bc69744-82t58" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.65:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.65:8443: connect: connection refused" Dec 05 01:43:27 crc kubenswrapper[4759]: I1205 01:43:27.147519 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-759bc69744-82t58" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.151513 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.711976 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-759bc69744-82t58" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.815263 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-scripts\") pod \"9978bcbc-9b12-401b-b73e-7aeb17587928\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.815412 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm4bt\" (UniqueName: \"kubernetes.io/projected/9978bcbc-9b12-401b-b73e-7aeb17587928-kube-api-access-wm4bt\") pod \"9978bcbc-9b12-401b-b73e-7aeb17587928\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.815491 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-tls-certs\") pod \"9978bcbc-9b12-401b-b73e-7aeb17587928\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.815676 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9978bcbc-9b12-401b-b73e-7aeb17587928-logs\") pod \"9978bcbc-9b12-401b-b73e-7aeb17587928\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.815765 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-combined-ca-bundle\") pod \"9978bcbc-9b12-401b-b73e-7aeb17587928\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.815815 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-config-data\") pod \"9978bcbc-9b12-401b-b73e-7aeb17587928\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.815907 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-secret-key\") pod \"9978bcbc-9b12-401b-b73e-7aeb17587928\" (UID: \"9978bcbc-9b12-401b-b73e-7aeb17587928\") " Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.816143 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9978bcbc-9b12-401b-b73e-7aeb17587928-logs" (OuterVolumeSpecName: "logs") pod "9978bcbc-9b12-401b-b73e-7aeb17587928" (UID: "9978bcbc-9b12-401b-b73e-7aeb17587928"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.816605 4759 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9978bcbc-9b12-401b-b73e-7aeb17587928-logs\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.821301 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9978bcbc-9b12-401b-b73e-7aeb17587928" (UID: "9978bcbc-9b12-401b-b73e-7aeb17587928"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.830504 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9978bcbc-9b12-401b-b73e-7aeb17587928-kube-api-access-wm4bt" (OuterVolumeSpecName: "kube-api-access-wm4bt") pod "9978bcbc-9b12-401b-b73e-7aeb17587928" (UID: "9978bcbc-9b12-401b-b73e-7aeb17587928"). InnerVolumeSpecName "kube-api-access-wm4bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.840063 4759 generic.go:334] "Generic (PLEG): container finished" podID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerID="4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57" exitCode=137 Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.840120 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-759bc69744-82t58" event={"ID":"9978bcbc-9b12-401b-b73e-7aeb17587928","Type":"ContainerDied","Data":"4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57"} Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.840154 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-759bc69744-82t58" event={"ID":"9978bcbc-9b12-401b-b73e-7aeb17587928","Type":"ContainerDied","Data":"d460350139f3d98f3f210b0411cf5080185deab396814346e1f7e110b938185d"} Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.840174 4759 scope.go:117] "RemoveContainer" containerID="290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.840410 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-759bc69744-82t58" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.848099 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-config-data" (OuterVolumeSpecName: "config-data") pod "9978bcbc-9b12-401b-b73e-7aeb17587928" (UID: "9978bcbc-9b12-401b-b73e-7aeb17587928"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.860594 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9978bcbc-9b12-401b-b73e-7aeb17587928" (UID: "9978bcbc-9b12-401b-b73e-7aeb17587928"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.871819 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-scripts" (OuterVolumeSpecName: "scripts") pod "9978bcbc-9b12-401b-b73e-7aeb17587928" (UID: "9978bcbc-9b12-401b-b73e-7aeb17587928"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.920108 4759 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.920145 4759 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.920158 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm4bt\" (UniqueName: \"kubernetes.io/projected/9978bcbc-9b12-401b-b73e-7aeb17587928-kube-api-access-wm4bt\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.920175 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.920190 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9978bcbc-9b12-401b-b73e-7aeb17587928-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:33 crc kubenswrapper[4759]: I1205 01:43:33.924519 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "9978bcbc-9b12-401b-b73e-7aeb17587928" (UID: "9978bcbc-9b12-401b-b73e-7aeb17587928"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:43:34 crc kubenswrapper[4759]: I1205 01:43:34.022009 4759 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9978bcbc-9b12-401b-b73e-7aeb17587928-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 01:43:34 crc kubenswrapper[4759]: I1205 01:43:34.138861 4759 scope.go:117] "RemoveContainer" containerID="4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57" Dec 05 01:43:34 crc kubenswrapper[4759]: I1205 01:43:34.213532 4759 scope.go:117] "RemoveContainer" containerID="290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb" Dec 05 01:43:34 crc kubenswrapper[4759]: E1205 01:43:34.217478 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb\": container with ID starting with 290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb not found: ID does not exist" containerID="290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb" Dec 05 01:43:34 crc kubenswrapper[4759]: I1205 01:43:34.217530 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb"} err="failed to get container status \"290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb\": rpc error: code = NotFound desc = could not find container \"290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb\": container with ID starting with 290d037e15287345125180fa955dbabb041d745d2545e242f5e2bdd81f6cffcb not found: ID does not exist" Dec 05 01:43:34 crc kubenswrapper[4759]: I1205 01:43:34.217556 4759 scope.go:117] "RemoveContainer" containerID="4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57" Dec 05 01:43:34 crc kubenswrapper[4759]: E1205 01:43:34.219796 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57\": container with ID starting with 4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57 not found: ID does not exist" containerID="4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57" Dec 05 01:43:34 crc kubenswrapper[4759]: I1205 01:43:34.219839 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57"} err="failed to get container status \"4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57\": rpc error: code = NotFound desc = could not find container \"4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57\": container with ID starting with 4b8bf34e9cb0ca5fb762aa2abe972dba6b228258c0f8de4386a43d38c1b1de57 not found: ID does not exist" Dec 05 01:43:34 crc kubenswrapper[4759]: I1205 01:43:34.223370 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-759bc69744-82t58"] Dec 05 01:43:34 crc kubenswrapper[4759]: I1205 01:43:34.254846 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-759bc69744-82t58"] Dec 05 01:43:35 crc kubenswrapper[4759]: I1205 01:43:35.182837 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" path="/var/lib/kubelet/pods/9978bcbc-9b12-401b-b73e-7aeb17587928/volumes" Dec 05 
01:43:36 crc kubenswrapper[4759]: I1205 01:43:36.517657 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 05 01:43:43 crc kubenswrapper[4759]: I1205 01:43:43.907464 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 01:43:44 crc kubenswrapper[4759]: I1205 01:43:44.722378 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.167236 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78"] Dec 05 01:45:00 crc kubenswrapper[4759]: E1205 01:45:00.168406 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.168425 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon" Dec 05 01:45:00 crc kubenswrapper[4759]: E1205 01:45:00.168478 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon-log" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.168488 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon-log" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.168753 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.168781 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9978bcbc-9b12-401b-b73e-7aeb17587928" containerName="horizon-log" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.169802 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.172142 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.173740 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.179122 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78"] Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.304446 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqqtz\" (UniqueName: \"kubernetes.io/projected/0e95e408-7164-432e-b96c-2ab6bc0d859a-kube-api-access-tqqtz\") pod \"collect-profiles-29414985-t7g78\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.304493 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e95e408-7164-432e-b96c-2ab6bc0d859a-config-volume\") pod \"collect-profiles-29414985-t7g78\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.304532 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e95e408-7164-432e-b96c-2ab6bc0d859a-secret-volume\") pod \"collect-profiles-29414985-t7g78\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.407758 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqqtz\" (UniqueName: \"kubernetes.io/projected/0e95e408-7164-432e-b96c-2ab6bc0d859a-kube-api-access-tqqtz\") pod \"collect-profiles-29414985-t7g78\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.407827 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e95e408-7164-432e-b96c-2ab6bc0d859a-config-volume\") pod \"collect-profiles-29414985-t7g78\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.407864 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e95e408-7164-432e-b96c-2ab6bc0d859a-secret-volume\") pod \"collect-profiles-29414985-t7g78\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.410367 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e95e408-7164-432e-b96c-2ab6bc0d859a-config-volume\") pod 
\"collect-profiles-29414985-t7g78\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.416711 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e95e408-7164-432e-b96c-2ab6bc0d859a-secret-volume\") pod \"collect-profiles-29414985-t7g78\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.439003 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqqtz\" (UniqueName: \"kubernetes.io/projected/0e95e408-7164-432e-b96c-2ab6bc0d859a-kube-api-access-tqqtz\") pod \"collect-profiles-29414985-t7g78\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:00 crc kubenswrapper[4759]: I1205 01:45:00.495446 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:01 crc kubenswrapper[4759]: I1205 01:45:01.007300 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78"] Dec 05 01:45:01 crc kubenswrapper[4759]: W1205 01:45:01.014435 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e95e408_7164_432e_b96c_2ab6bc0d859a.slice/crio-b2037c0bb26d10fc714b90f1558cb8ea762f430ac5c87ce7cab79b2afffd7a1d WatchSource:0}: Error finding container b2037c0bb26d10fc714b90f1558cb8ea762f430ac5c87ce7cab79b2afffd7a1d: Status 404 returned error can't find the container with id b2037c0bb26d10fc714b90f1558cb8ea762f430ac5c87ce7cab79b2afffd7a1d Dec 05 01:45:01 crc kubenswrapper[4759]: E1205 01:45:01.576090 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e95e408_7164_432e_b96c_2ab6bc0d859a.slice/crio-conmon-747a7b77c7446bee1d5eb864b4b855b34f7dde5d790dbb2e750622dff03fc40d.scope\": RecentStats: unable to find data in memory cache]" Dec 05 01:45:02 crc kubenswrapper[4759]: I1205 01:45:02.017540 4759 generic.go:334] "Generic (PLEG): container finished" podID="0e95e408-7164-432e-b96c-2ab6bc0d859a" containerID="747a7b77c7446bee1d5eb864b4b855b34f7dde5d790dbb2e750622dff03fc40d" exitCode=0 Dec 05 01:45:02 crc kubenswrapper[4759]: I1205 01:45:02.017588 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" event={"ID":"0e95e408-7164-432e-b96c-2ab6bc0d859a","Type":"ContainerDied","Data":"747a7b77c7446bee1d5eb864b4b855b34f7dde5d790dbb2e750622dff03fc40d"} Dec 05 01:45:02 crc kubenswrapper[4759]: I1205 01:45:02.017976 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" event={"ID":"0e95e408-7164-432e-b96c-2ab6bc0d859a","Type":"ContainerStarted","Data":"b2037c0bb26d10fc714b90f1558cb8ea762f430ac5c87ce7cab79b2afffd7a1d"} Dec 05 01:45:03 crc kubenswrapper[4759]: I1205 01:45:03.510334 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:03 crc kubenswrapper[4759]: I1205 01:45:03.693023 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqqtz\" (UniqueName: \"kubernetes.io/projected/0e95e408-7164-432e-b96c-2ab6bc0d859a-kube-api-access-tqqtz\") pod \"0e95e408-7164-432e-b96c-2ab6bc0d859a\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " Dec 05 01:45:03 crc kubenswrapper[4759]: I1205 01:45:03.693472 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e95e408-7164-432e-b96c-2ab6bc0d859a-secret-volume\") pod \"0e95e408-7164-432e-b96c-2ab6bc0d859a\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " Dec 05 01:45:03 crc kubenswrapper[4759]: I1205 01:45:03.693587 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e95e408-7164-432e-b96c-2ab6bc0d859a-config-volume\") pod \"0e95e408-7164-432e-b96c-2ab6bc0d859a\" (UID: \"0e95e408-7164-432e-b96c-2ab6bc0d859a\") " Dec 05 01:45:03 crc kubenswrapper[4759]: I1205 01:45:03.695363 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e95e408-7164-432e-b96c-2ab6bc0d859a-config-volume" (OuterVolumeSpecName: "config-volume") pod "0e95e408-7164-432e-b96c-2ab6bc0d859a" (UID: "0e95e408-7164-432e-b96c-2ab6bc0d859a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 01:45:03 crc kubenswrapper[4759]: I1205 01:45:03.699908 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e95e408-7164-432e-b96c-2ab6bc0d859a-kube-api-access-tqqtz" (OuterVolumeSpecName: "kube-api-access-tqqtz") pod "0e95e408-7164-432e-b96c-2ab6bc0d859a" (UID: "0e95e408-7164-432e-b96c-2ab6bc0d859a"). InnerVolumeSpecName "kube-api-access-tqqtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:45:03 crc kubenswrapper[4759]: I1205 01:45:03.710211 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e95e408-7164-432e-b96c-2ab6bc0d859a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0e95e408-7164-432e-b96c-2ab6bc0d859a" (UID: "0e95e408-7164-432e-b96c-2ab6bc0d859a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 01:45:03 crc kubenswrapper[4759]: I1205 01:45:03.796750 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqqtz\" (UniqueName: \"kubernetes.io/projected/0e95e408-7164-432e-b96c-2ab6bc0d859a-kube-api-access-tqqtz\") on node \"crc\" DevicePath \"\"" Dec 05 01:45:03 crc kubenswrapper[4759]: I1205 01:45:03.796789 4759 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e95e408-7164-432e-b96c-2ab6bc0d859a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 01:45:03 crc kubenswrapper[4759]: I1205 01:45:03.796801 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e95e408-7164-432e-b96c-2ab6bc0d859a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 01:45:04 crc kubenswrapper[4759]: I1205 01:45:04.061634 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" event={"ID":"0e95e408-7164-432e-b96c-2ab6bc0d859a","Type":"ContainerDied","Data":"b2037c0bb26d10fc714b90f1558cb8ea762f430ac5c87ce7cab79b2afffd7a1d"} Dec 05 01:45:04 crc kubenswrapper[4759]: I1205 01:45:04.061682 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78" Dec 05 01:45:04 crc kubenswrapper[4759]: I1205 01:45:04.061724 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2037c0bb26d10fc714b90f1558cb8ea762f430ac5c87ce7cab79b2afffd7a1d" Dec 05 01:45:04 crc kubenswrapper[4759]: I1205 01:45:04.434972 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:45:04 crc kubenswrapper[4759]: I1205 01:45:04.435300 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:45:04 crc kubenswrapper[4759]: I1205 01:45:04.604236 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"] Dec 05 01:45:04 crc kubenswrapper[4759]: I1205 01:45:04.620605 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414940-97xc7"] Dec 05 01:45:05 crc kubenswrapper[4759]: I1205 01:45:05.182802 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1a3bf6-d1bb-407b-b081-7656a0ffaa04" path="/var/lib/kubelet/pods/7a1a3bf6-d1bb-407b-b081-7656a0ffaa04/volumes" Dec 05 01:45:34 crc kubenswrapper[4759]: I1205 01:45:34.434877 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:45:34 crc kubenswrapper[4759]: I1205 01:45:34.435488 4759 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:45:35 crc kubenswrapper[4759]: I1205 01:45:35.302872 4759 scope.go:117] "RemoveContainer" containerID="a693af9082d34c20ecbba90a5cf5dcf55d4c3ad96f64e20a3594a1e39e506122" Dec 05 01:45:55 crc kubenswrapper[4759]: I1205 01:45:55.974349 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t8fhp"] Dec 05 01:45:55 crc kubenswrapper[4759]: E1205 01:45:55.975264 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e95e408-7164-432e-b96c-2ab6bc0d859a" containerName="collect-profiles" Dec 05 01:45:55 crc kubenswrapper[4759]: I1205 01:45:55.975277 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e95e408-7164-432e-b96c-2ab6bc0d859a" containerName="collect-profiles" Dec 05 01:45:55 crc kubenswrapper[4759]: I1205 01:45:55.975511 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e95e408-7164-432e-b96c-2ab6bc0d859a" containerName="collect-profiles" Dec 05 01:45:55 crc kubenswrapper[4759]: I1205 01:45:55.977063 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:45:55 crc kubenswrapper[4759]: I1205 01:45:55.994633 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8fhp"] Dec 05 01:45:56 crc kubenswrapper[4759]: I1205 01:45:56.125084 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-utilities\") pod \"redhat-marketplace-t8fhp\" (UID: \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:45:56 crc kubenswrapper[4759]: I1205 01:45:56.125171 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-catalog-content\") pod \"redhat-marketplace-t8fhp\" (UID: \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:45:56 crc kubenswrapper[4759]: I1205 01:45:56.125253 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z6rt\" (UniqueName: \"kubernetes.io/projected/13a59c03-b1f7-46d9-bcdf-942d3340abd8-kube-api-access-9z6rt\") pod \"redhat-marketplace-t8fhp\" (UID: \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:45:56 crc kubenswrapper[4759]: I1205 01:45:56.226942 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-catalog-content\") pod \"redhat-marketplace-t8fhp\" (UID: \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:45:56 crc kubenswrapper[4759]: I1205 01:45:56.227038 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z6rt\" (UniqueName: \"kubernetes.io/projected/13a59c03-b1f7-46d9-bcdf-942d3340abd8-kube-api-access-9z6rt\") pod \"redhat-marketplace-t8fhp\" (UID: 
\"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:45:56 crc kubenswrapper[4759]: I1205 01:45:56.227170 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-utilities\") pod \"redhat-marketplace-t8fhp\" (UID: \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:45:56 crc kubenswrapper[4759]: I1205 01:45:56.227649 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-catalog-content\") pod \"redhat-marketplace-t8fhp\" (UID: \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:45:56 crc kubenswrapper[4759]: I1205 01:45:56.228143 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-utilities\") pod \"redhat-marketplace-t8fhp\" (UID: \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:45:56 crc kubenswrapper[4759]: I1205 01:45:56.246232 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z6rt\" (UniqueName: \"kubernetes.io/projected/13a59c03-b1f7-46d9-bcdf-942d3340abd8-kube-api-access-9z6rt\") pod \"redhat-marketplace-t8fhp\" (UID: \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:45:56 crc kubenswrapper[4759]: I1205 01:45:56.323072 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:45:56 crc kubenswrapper[4759]: I1205 01:45:56.867570 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8fhp"] Dec 05 01:45:57 crc kubenswrapper[4759]: I1205 01:45:57.727286 4759 generic.go:334] "Generic (PLEG): container finished" podID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerID="a56323e060cfeb2cd84d8f20e6815eccc55cdeb150d26a823bee9ab295d48c11" exitCode=0 Dec 05 01:45:57 crc kubenswrapper[4759]: I1205 01:45:57.727370 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8fhp" event={"ID":"13a59c03-b1f7-46d9-bcdf-942d3340abd8","Type":"ContainerDied","Data":"a56323e060cfeb2cd84d8f20e6815eccc55cdeb150d26a823bee9ab295d48c11"} Dec 05 01:45:57 crc kubenswrapper[4759]: I1205 01:45:57.727734 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8fhp" event={"ID":"13a59c03-b1f7-46d9-bcdf-942d3340abd8","Type":"ContainerStarted","Data":"5ee350cd325fdf89f34cdd14a1ff84983ccec9c8a3fec1660e21d7013961fa7d"} Dec 05 01:45:59 crc kubenswrapper[4759]: I1205 01:45:59.787686 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8fhp" event={"ID":"13a59c03-b1f7-46d9-bcdf-942d3340abd8","Type":"ContainerStarted","Data":"99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f"} Dec 05 01:46:00 crc kubenswrapper[4759]: I1205 01:46:00.807590 4759 generic.go:334] "Generic (PLEG): container finished" podID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerID="99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f" exitCode=0 Dec 05 01:46:00 crc kubenswrapper[4759]: I1205 01:46:00.807685 4759 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8fhp" event={"ID":"13a59c03-b1f7-46d9-bcdf-942d3340abd8","Type":"ContainerDied","Data":"99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f"} Dec 05 01:46:02 crc kubenswrapper[4759]: I1205 01:46:02.827224 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8fhp" event={"ID":"13a59c03-b1f7-46d9-bcdf-942d3340abd8","Type":"ContainerStarted","Data":"7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb"} Dec 05 01:46:02 crc kubenswrapper[4759]: I1205 01:46:02.852847 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t8fhp" podStartSLOduration=4.232985185 podStartE2EDuration="7.852826317s" podCreationTimestamp="2025-12-05 01:45:55 +0000 UTC" firstStartedPulling="2025-12-05 01:45:57.729393898 +0000 UTC m=+4976.945054888" lastFinishedPulling="2025-12-05 01:46:01.34923506 +0000 UTC m=+4980.564896020" observedRunningTime="2025-12-05 01:46:02.843072521 +0000 UTC m=+4982.058733471" watchObservedRunningTime="2025-12-05 01:46:02.852826317 +0000 UTC m=+4982.068487267" Dec 05 01:46:04 crc kubenswrapper[4759]: I1205 01:46:04.433517 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:46:04 crc kubenswrapper[4759]: I1205 01:46:04.434053 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:46:04 crc kubenswrapper[4759]: I1205 01:46:04.434094 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 01:46:04 crc kubenswrapper[4759]: I1205 01:46:04.434856 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 01:46:04 crc kubenswrapper[4759]: I1205 01:46:04.434900 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" gracePeriod=600 Dec 05 01:46:04 crc kubenswrapper[4759]: E1205 01:46:04.557938 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:46:04 crc kubenswrapper[4759]: I1205 01:46:04.855253 4759 
generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" exitCode=0 Dec 05 01:46:04 crc kubenswrapper[4759]: I1205 01:46:04.855302 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"} Dec 05 01:46:04 crc kubenswrapper[4759]: I1205 01:46:04.855393 4759 scope.go:117] "RemoveContainer" containerID="6ce8dcfeed1a3217aa548a4e4b2248720f8aefade2ef5bbe94669f831255aef7" Dec 05 01:46:04 crc kubenswrapper[4759]: I1205 01:46:04.856153 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:46:04 crc kubenswrapper[4759]: E1205 01:46:04.856563 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:46:06 crc kubenswrapper[4759]: I1205 01:46:06.323390 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:46:06 crc kubenswrapper[4759]: I1205 01:46:06.324087 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:46:06 crc kubenswrapper[4759]: I1205 01:46:06.426047 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:46:16 crc kubenswrapper[4759]: I1205 01:46:16.386041 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:46:16 crc kubenswrapper[4759]: I1205 01:46:16.455002 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8fhp"] Dec 05 01:46:17 crc kubenswrapper[4759]: I1205 01:46:17.032778 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t8fhp" podUID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerName="registry-server" containerID="cri-o://7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb" gracePeriod=2 Dec 05 01:46:17 crc kubenswrapper[4759]: E1205 01:46:17.294807 4759 log.go:32] "ExecSync cmd from runtime service failed" err=< Dec 05 01:46:17 crc kubenswrapper[4759]: rpc error: code = Unknown desc = command error: setns `mnt`: Bad file descriptor Dec 05 01:46:17 crc kubenswrapper[4759]: fail startup Dec 05 01:46:17 crc kubenswrapper[4759]: , stdout: , stderr: , exit code -1 Dec 05 01:46:17 crc kubenswrapper[4759]: > containerID="7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 01:46:17 crc kubenswrapper[4759]: E1205 01:46:17.295567 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb is running failed: container process not found" 
containerID="7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 01:46:17 crc kubenswrapper[4759]: E1205 01:46:17.295872 4759 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb is running failed: container process not found" containerID="7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb" cmd=["grpc_health_probe","-addr=:50051"] Dec 05 01:46:17 crc kubenswrapper[4759]: E1205 01:46:17.295909 4759 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb is running failed: container process not found" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-t8fhp" podUID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerName="registry-server" Dec 05 01:46:17 crc kubenswrapper[4759]: I1205 01:46:17.723818 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:46:17 crc kubenswrapper[4759]: I1205 01:46:17.850710 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-utilities\") pod \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\" (UID: \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " Dec 05 01:46:17 crc kubenswrapper[4759]: I1205 01:46:17.850878 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z6rt\" (UniqueName: \"kubernetes.io/projected/13a59c03-b1f7-46d9-bcdf-942d3340abd8-kube-api-access-9z6rt\") pod \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\" (UID: \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " Dec 05 01:46:17 crc kubenswrapper[4759]: I1205 01:46:17.851146 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-catalog-content\") pod \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\" (UID: \"13a59c03-b1f7-46d9-bcdf-942d3340abd8\") " Dec 05 01:46:17 crc kubenswrapper[4759]: I1205 01:46:17.851975 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-utilities" (OuterVolumeSpecName: "utilities") pod "13a59c03-b1f7-46d9-bcdf-942d3340abd8" (UID: "13a59c03-b1f7-46d9-bcdf-942d3340abd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:46:17 crc kubenswrapper[4759]: I1205 01:46:17.862443 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a59c03-b1f7-46d9-bcdf-942d3340abd8-kube-api-access-9z6rt" (OuterVolumeSpecName: "kube-api-access-9z6rt") pod "13a59c03-b1f7-46d9-bcdf-942d3340abd8" (UID: "13a59c03-b1f7-46d9-bcdf-942d3340abd8"). InnerVolumeSpecName "kube-api-access-9z6rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:46:17 crc kubenswrapper[4759]: I1205 01:46:17.882855 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13a59c03-b1f7-46d9-bcdf-942d3340abd8" (UID: "13a59c03-b1f7-46d9-bcdf-942d3340abd8"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:46:17 crc kubenswrapper[4759]: I1205 01:46:17.954011 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:46:17 crc kubenswrapper[4759]: I1205 01:46:17.954073 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z6rt\" (UniqueName: \"kubernetes.io/projected/13a59c03-b1f7-46d9-bcdf-942d3340abd8-kube-api-access-9z6rt\") on node \"crc\" DevicePath \"\"" Dec 05 01:46:17 crc kubenswrapper[4759]: I1205 01:46:17.954088 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a59c03-b1f7-46d9-bcdf-942d3340abd8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.058762 4759 generic.go:334] "Generic (PLEG): container finished" podID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerID="7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb" exitCode=0 Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.058819 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8fhp" Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.058817 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8fhp" event={"ID":"13a59c03-b1f7-46d9-bcdf-942d3340abd8","Type":"ContainerDied","Data":"7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb"} Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.058899 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8fhp" event={"ID":"13a59c03-b1f7-46d9-bcdf-942d3340abd8","Type":"ContainerDied","Data":"5ee350cd325fdf89f34cdd14a1ff84983ccec9c8a3fec1660e21d7013961fa7d"} Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.058935 4759 scope.go:117] "RemoveContainer" containerID="7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb" Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.097859 4759 scope.go:117] "RemoveContainer" containerID="99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f" Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.101572 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8fhp"] Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.121461 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8fhp"] Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.142743 4759 scope.go:117] "RemoveContainer" containerID="a56323e060cfeb2cd84d8f20e6815eccc55cdeb150d26a823bee9ab295d48c11" Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.172683 4759 scope.go:117] "RemoveContainer" containerID="7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb" Dec 05 01:46:18 crc kubenswrapper[4759]: E1205 01:46:18.173427 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb\": container with ID starting with 7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb not found: ID does not exist" containerID="7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb" Dec 05 01:46:18 crc 
kubenswrapper[4759]: I1205 01:46:18.173477 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb"} err="failed to get container status \"7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb\": rpc error: code = NotFound desc = could not find container \"7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb\": container with ID starting with 7a72227220652408b4e997b02ccc3d79ee648df67df5fa4c157c5f3760eb24eb not found: ID does not exist" Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.173502 4759 scope.go:117] "RemoveContainer" containerID="99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f" Dec 05 01:46:18 crc kubenswrapper[4759]: E1205 01:46:18.174051 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f\": container with ID starting with 99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f not found: ID does not exist" containerID="99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f" Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.174094 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f"} err="failed to get container status \"99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f\": rpc error: code = NotFound desc = could not find container \"99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f\": container with ID starting with 99a5ce45a4ddf13d81cc4be6a27813aa3c5fd36e63662a45481e1871f3068a8f not found: ID does not exist" Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.174115 4759 scope.go:117] "RemoveContainer" containerID="a56323e060cfeb2cd84d8f20e6815eccc55cdeb150d26a823bee9ab295d48c11" Dec 05 01:46:18 crc kubenswrapper[4759]: E1205 01:46:18.174338 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a56323e060cfeb2cd84d8f20e6815eccc55cdeb150d26a823bee9ab295d48c11\": container with ID starting with a56323e060cfeb2cd84d8f20e6815eccc55cdeb150d26a823bee9ab295d48c11 not found: ID does not exist" containerID="a56323e060cfeb2cd84d8f20e6815eccc55cdeb150d26a823bee9ab295d48c11" Dec 05 01:46:18 crc kubenswrapper[4759]: I1205 01:46:18.174359 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a56323e060cfeb2cd84d8f20e6815eccc55cdeb150d26a823bee9ab295d48c11"} err="failed to get container status \"a56323e060cfeb2cd84d8f20e6815eccc55cdeb150d26a823bee9ab295d48c11\": rpc error: code = NotFound desc = could not find container \"a56323e060cfeb2cd84d8f20e6815eccc55cdeb150d26a823bee9ab295d48c11\": container with ID starting with a56323e060cfeb2cd84d8f20e6815eccc55cdeb150d26a823bee9ab295d48c11 not found: ID does not exist" Dec 05 01:46:19 crc kubenswrapper[4759]: I1205 01:46:19.175919 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" path="/var/lib/kubelet/pods/13a59c03-b1f7-46d9-bcdf-942d3340abd8/volumes" Dec 05 01:46:20 crc kubenswrapper[4759]: I1205 01:46:20.156959 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:46:20 crc kubenswrapper[4759]: E1205 01:46:20.157790 4759 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:46:33 crc kubenswrapper[4759]: I1205 01:46:33.156815 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:46:33 crc kubenswrapper[4759]: E1205 01:46:33.158173 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:46:48 crc kubenswrapper[4759]: I1205 01:46:48.158016 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:46:48 crc kubenswrapper[4759]: E1205 01:46:48.159210 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:47:02 crc kubenswrapper[4759]: I1205 01:47:02.156324 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:47:02 crc kubenswrapper[4759]: E1205 01:47:02.157231 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:47:14 crc kubenswrapper[4759]: I1205 01:47:14.156455 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:47:14 crc kubenswrapper[4759]: E1205 01:47:14.157399 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:47:28 crc kubenswrapper[4759]: I1205 01:47:28.155209 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:47:28 crc kubenswrapper[4759]: E1205 01:47:28.156115 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:47:42 crc kubenswrapper[4759]: I1205 01:47:42.157458 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:47:42 crc kubenswrapper[4759]: E1205 01:47:42.160630 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:47:57 crc kubenswrapper[4759]: I1205 01:47:57.156795 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:47:57 crc kubenswrapper[4759]: E1205 01:47:57.157607 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:48:11 crc kubenswrapper[4759]: I1205 01:48:11.170713 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:48:11 crc kubenswrapper[4759]: E1205 01:48:11.172114 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:48:16 crc kubenswrapper[4759]: I1205 01:48:16.741845 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="1013063a-9cdb-47ba-8c7d-5161bbbad9d4" containerName="galera" probeResult="failure" output="command timed out" Dec 05 01:48:16 crc kubenswrapper[4759]: I1205 01:48:16.742938 4759 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="1013063a-9cdb-47ba-8c7d-5161bbbad9d4" containerName="galera" probeResult="failure" output="command timed out" Dec 05 01:48:23 crc kubenswrapper[4759]: I1205 01:48:23.156287 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:48:23 crc kubenswrapper[4759]: E1205 01:48:23.157858 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:48:37 crc kubenswrapper[4759]: I1205 01:48:37.156640 4759 
scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:48:37 crc kubenswrapper[4759]: E1205 01:48:37.157405 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:48:39 crc kubenswrapper[4759]: I1205 01:48:39.973140 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sp7q8"] Dec 05 01:48:39 crc kubenswrapper[4759]: E1205 01:48:39.974432 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerName="extract-content" Dec 05 01:48:39 crc kubenswrapper[4759]: I1205 01:48:39.974449 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerName="extract-content" Dec 05 01:48:39 crc kubenswrapper[4759]: E1205 01:48:39.974462 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerName="extract-utilities" Dec 05 01:48:39 crc kubenswrapper[4759]: I1205 01:48:39.974469 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerName="extract-utilities" Dec 05 01:48:39 crc kubenswrapper[4759]: E1205 01:48:39.974502 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerName="registry-server" Dec 05 01:48:39 crc kubenswrapper[4759]: I1205 01:48:39.974508 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerName="registry-server" Dec 05 01:48:39 crc kubenswrapper[4759]: I1205 01:48:39.974764 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a59c03-b1f7-46d9-bcdf-942d3340abd8" containerName="registry-server" Dec 05 01:48:39 crc kubenswrapper[4759]: I1205 01:48:39.976454 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:39 crc kubenswrapper[4759]: I1205 01:48:39.988533 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sp7q8"] Dec 05 01:48:40 crc kubenswrapper[4759]: I1205 01:48:40.044084 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-utilities\") pod \"redhat-operators-sp7q8\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:40 crc kubenswrapper[4759]: I1205 01:48:40.044172 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg4n5\" (UniqueName: \"kubernetes.io/projected/fe90fcb2-c039-483b-9bd4-96ed5c508854-kube-api-access-zg4n5\") pod \"redhat-operators-sp7q8\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:40 crc kubenswrapper[4759]: I1205 01:48:40.044436 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-catalog-content\") pod \"redhat-operators-sp7q8\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:40 crc kubenswrapper[4759]: I1205 01:48:40.146795 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg4n5\" (UniqueName: \"kubernetes.io/projected/fe90fcb2-c039-483b-9bd4-96ed5c508854-kube-api-access-zg4n5\") pod \"redhat-operators-sp7q8\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:40 crc kubenswrapper[4759]: I1205 01:48:40.147195 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-catalog-content\") pod \"redhat-operators-sp7q8\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:40 crc kubenswrapper[4759]: I1205 01:48:40.147320 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-utilities\") pod \"redhat-operators-sp7q8\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:40 crc kubenswrapper[4759]: I1205 01:48:40.147659 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-catalog-content\") pod \"redhat-operators-sp7q8\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:40 crc kubenswrapper[4759]: I1205 01:48:40.147744 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-utilities\") pod \"redhat-operators-sp7q8\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:40 crc kubenswrapper[4759]: I1205 01:48:40.167340 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zg4n5\" (UniqueName: \"kubernetes.io/projected/fe90fcb2-c039-483b-9bd4-96ed5c508854-kube-api-access-zg4n5\") pod \"redhat-operators-sp7q8\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:40 crc kubenswrapper[4759]: I1205 01:48:40.294634 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:40 crc kubenswrapper[4759]: I1205 01:48:40.875209 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sp7q8"] Dec 05 01:48:41 crc kubenswrapper[4759]: W1205 01:48:41.355074 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe90fcb2_c039_483b_9bd4_96ed5c508854.slice/crio-3bf2eee8b44a019c5b00c6e6a737b7c8c6132803a5fb695f1c6bda775d56a6ba WatchSource:0}: Error finding container 3bf2eee8b44a019c5b00c6e6a737b7c8c6132803a5fb695f1c6bda775d56a6ba: Status 404 returned error can't find the container with id 3bf2eee8b44a019c5b00c6e6a737b7c8c6132803a5fb695f1c6bda775d56a6ba Dec 05 01:48:41 crc kubenswrapper[4759]: I1205 01:48:41.844292 4759 generic.go:334] "Generic (PLEG): container finished" podID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerID="b0438397c7f1593ffec212eaeea7f11fc89f5d9ba2b992e6d4212592860c6b01" exitCode=0 Dec 05 01:48:41 crc kubenswrapper[4759]: I1205 01:48:41.844415 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp7q8" event={"ID":"fe90fcb2-c039-483b-9bd4-96ed5c508854","Type":"ContainerDied","Data":"b0438397c7f1593ffec212eaeea7f11fc89f5d9ba2b992e6d4212592860c6b01"} Dec 05 01:48:41 crc kubenswrapper[4759]: I1205 01:48:41.845284 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp7q8" event={"ID":"fe90fcb2-c039-483b-9bd4-96ed5c508854","Type":"ContainerStarted","Data":"3bf2eee8b44a019c5b00c6e6a737b7c8c6132803a5fb695f1c6bda775d56a6ba"} Dec 05 01:48:41 crc kubenswrapper[4759]: I1205 01:48:41.847868 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 01:48:42 crc kubenswrapper[4759]: I1205 01:48:42.860921 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp7q8" event={"ID":"fe90fcb2-c039-483b-9bd4-96ed5c508854","Type":"ContainerStarted","Data":"539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b"} Dec 05 01:48:46 crc kubenswrapper[4759]: I1205 01:48:46.899574 4759 generic.go:334] "Generic (PLEG): container finished" podID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerID="539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b" exitCode=0 Dec 05 01:48:46 crc kubenswrapper[4759]: I1205 01:48:46.899671 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp7q8" event={"ID":"fe90fcb2-c039-483b-9bd4-96ed5c508854","Type":"ContainerDied","Data":"539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b"} Dec 05 01:48:47 crc kubenswrapper[4759]: I1205 01:48:47.917059 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp7q8" event={"ID":"fe90fcb2-c039-483b-9bd4-96ed5c508854","Type":"ContainerStarted","Data":"75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a"} Dec 05 01:48:47 crc kubenswrapper[4759]: I1205 01:48:47.940739 4759 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-sp7q8" podStartSLOduration=3.480060552 podStartE2EDuration="8.940717054s" podCreationTimestamp="2025-12-05 01:48:39 +0000 UTC" firstStartedPulling="2025-12-05 01:48:41.847472016 +0000 UTC m=+5141.063132976" lastFinishedPulling="2025-12-05 01:48:47.308128518 +0000 UTC m=+5146.523789478" observedRunningTime="2025-12-05 01:48:47.936960802 +0000 UTC m=+5147.152621762" watchObservedRunningTime="2025-12-05 01:48:47.940717054 +0000 UTC m=+5147.156378034" Dec 05 01:48:50 crc kubenswrapper[4759]: I1205 01:48:50.295771 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:50 crc kubenswrapper[4759]: I1205 01:48:50.296381 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:48:51 crc kubenswrapper[4759]: I1205 01:48:51.348999 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sp7q8" podUID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerName="registry-server" probeResult="failure" output=< Dec 05 01:48:51 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 01:48:51 crc kubenswrapper[4759]: > Dec 05 01:48:52 crc kubenswrapper[4759]: I1205 01:48:52.156733 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:48:52 crc kubenswrapper[4759]: E1205 01:48:52.157747 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:49:00 crc kubenswrapper[4759]: I1205 01:49:00.393683 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:49:00 crc kubenswrapper[4759]: I1205 01:49:00.498545 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:49:00 crc kubenswrapper[4759]: I1205 01:49:00.638529 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sp7q8"] Dec 05 01:49:02 crc kubenswrapper[4759]: I1205 01:49:02.100854 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sp7q8" podUID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerName="registry-server" containerID="cri-o://75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a" gracePeriod=2 Dec 05 01:49:02 crc kubenswrapper[4759]: I1205 01:49:02.796269 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:49:02 crc kubenswrapper[4759]: I1205 01:49:02.982151 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-utilities\") pod \"fe90fcb2-c039-483b-9bd4-96ed5c508854\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " Dec 05 01:49:02 crc kubenswrapper[4759]: I1205 01:49:02.982475 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg4n5\" (UniqueName: \"kubernetes.io/projected/fe90fcb2-c039-483b-9bd4-96ed5c508854-kube-api-access-zg4n5\") pod \"fe90fcb2-c039-483b-9bd4-96ed5c508854\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " Dec 05 01:49:02 crc kubenswrapper[4759]: I1205 01:49:02.982544 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-catalog-content\") pod \"fe90fcb2-c039-483b-9bd4-96ed5c508854\" (UID: \"fe90fcb2-c039-483b-9bd4-96ed5c508854\") " Dec 05 01:49:02 crc kubenswrapper[4759]: I1205 01:49:02.982868 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-utilities" (OuterVolumeSpecName: "utilities") pod "fe90fcb2-c039-483b-9bd4-96ed5c508854" (UID: "fe90fcb2-c039-483b-9bd4-96ed5c508854"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:49:02 crc kubenswrapper[4759]: I1205 01:49:02.983647 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:49:02 crc kubenswrapper[4759]: I1205 01:49:02.995060 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe90fcb2-c039-483b-9bd4-96ed5c508854-kube-api-access-zg4n5" (OuterVolumeSpecName: "kube-api-access-zg4n5") pod "fe90fcb2-c039-483b-9bd4-96ed5c508854" (UID: "fe90fcb2-c039-483b-9bd4-96ed5c508854"). InnerVolumeSpecName "kube-api-access-zg4n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.087241 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg4n5\" (UniqueName: \"kubernetes.io/projected/fe90fcb2-c039-483b-9bd4-96ed5c508854-kube-api-access-zg4n5\") on node \"crc\" DevicePath \"\"" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.109768 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe90fcb2-c039-483b-9bd4-96ed5c508854" (UID: "fe90fcb2-c039-483b-9bd4-96ed5c508854"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.141927 4759 generic.go:334] "Generic (PLEG): container finished" podID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerID="75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a" exitCode=0 Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.141966 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp7q8" event={"ID":"fe90fcb2-c039-483b-9bd4-96ed5c508854","Type":"ContainerDied","Data":"75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a"} Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.141993 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp7q8" event={"ID":"fe90fcb2-c039-483b-9bd4-96ed5c508854","Type":"ContainerDied","Data":"3bf2eee8b44a019c5b00c6e6a737b7c8c6132803a5fb695f1c6bda775d56a6ba"} Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.142012 4759 scope.go:117] "RemoveContainer" containerID="75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.142148 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sp7q8" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.205905 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sp7q8"] Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.207081 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe90fcb2-c039-483b-9bd4-96ed5c508854-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.217260 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sp7q8"] Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.230546 4759 scope.go:117] "RemoveContainer" containerID="539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.272882 4759 scope.go:117] "RemoveContainer" containerID="b0438397c7f1593ffec212eaeea7f11fc89f5d9ba2b992e6d4212592860c6b01" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.330691 4759 scope.go:117] "RemoveContainer" containerID="75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a" Dec 05 01:49:03 crc kubenswrapper[4759]: E1205 01:49:03.331175 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a\": container with ID starting with 75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a not found: ID does not exist" containerID="75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.331207 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a"} err="failed to get container status \"75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a\": rpc error: code = NotFound desc = could not find container \"75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a\": container with ID starting with 75073b79963b6a1ef4e9cb71fe7868919ded50d93325cd920e455ef22d6e108a not found: ID does not exist" Dec 05 01:49:03 crc 
kubenswrapper[4759]: I1205 01:49:03.331227 4759 scope.go:117] "RemoveContainer" containerID="539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b" Dec 05 01:49:03 crc kubenswrapper[4759]: E1205 01:49:03.331467 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b\": container with ID starting with 539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b not found: ID does not exist" containerID="539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.331502 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b"} err="failed to get container status \"539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b\": rpc error: code = NotFound desc = could not find container \"539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b\": container with ID starting with 539a105d8066ce964748f2c417a4f145e000b8764936cbcd7cc46fc9c6f0952b not found: ID does not exist" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.331516 4759 scope.go:117] "RemoveContainer" containerID="b0438397c7f1593ffec212eaeea7f11fc89f5d9ba2b992e6d4212592860c6b01" Dec 05 01:49:03 crc kubenswrapper[4759]: E1205 01:49:03.332486 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0438397c7f1593ffec212eaeea7f11fc89f5d9ba2b992e6d4212592860c6b01\": container with ID starting with b0438397c7f1593ffec212eaeea7f11fc89f5d9ba2b992e6d4212592860c6b01 not found: ID does not exist" containerID="b0438397c7f1593ffec212eaeea7f11fc89f5d9ba2b992e6d4212592860c6b01" Dec 05 01:49:03 crc kubenswrapper[4759]: I1205 01:49:03.332510 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0438397c7f1593ffec212eaeea7f11fc89f5d9ba2b992e6d4212592860c6b01"} err="failed to get container status \"b0438397c7f1593ffec212eaeea7f11fc89f5d9ba2b992e6d4212592860c6b01\": rpc error: code = NotFound desc = could not find container \"b0438397c7f1593ffec212eaeea7f11fc89f5d9ba2b992e6d4212592860c6b01\": container with ID starting with b0438397c7f1593ffec212eaeea7f11fc89f5d9ba2b992e6d4212592860c6b01 not found: ID does not exist" Dec 05 01:49:04 crc kubenswrapper[4759]: I1205 01:49:04.157783 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:49:04 crc kubenswrapper[4759]: E1205 01:49:04.158704 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:49:05 crc kubenswrapper[4759]: I1205 01:49:05.179785 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe90fcb2-c039-483b-9bd4-96ed5c508854" path="/var/lib/kubelet/pods/fe90fcb2-c039-483b-9bd4-96ed5c508854/volumes" Dec 05 01:49:17 crc kubenswrapper[4759]: I1205 01:49:17.157238 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" 
Dec 05 01:49:17 crc kubenswrapper[4759]: E1205 01:49:17.158011 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.224614 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9lf79"]
Dec 05 01:49:28 crc kubenswrapper[4759]: E1205 01:49:28.225519 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerName="registry-server"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.225533 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerName="registry-server"
Dec 05 01:49:28 crc kubenswrapper[4759]: E1205 01:49:28.225560 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerName="extract-utilities"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.225567 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerName="extract-utilities"
Dec 05 01:49:28 crc kubenswrapper[4759]: E1205 01:49:28.225599 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerName="extract-content"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.225606 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerName="extract-content"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.225804 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe90fcb2-c039-483b-9bd4-96ed5c508854" containerName="registry-server"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.227408 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.243453 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-catalog-content\") pod \"certified-operators-9lf79\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") " pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.243542 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-utilities\") pod \"certified-operators-9lf79\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") " pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.243637 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hl4m\" (UniqueName: \"kubernetes.io/projected/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-kube-api-access-6hl4m\") pod \"certified-operators-9lf79\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") " pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.243662 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lf79"]
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.345465 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-catalog-content\") pod \"certified-operators-9lf79\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") " pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.345796 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-utilities\") pod \"certified-operators-9lf79\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") " pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.345943 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hl4m\" (UniqueName: \"kubernetes.io/projected/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-kube-api-access-6hl4m\") pod \"certified-operators-9lf79\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") " pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.346138 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-utilities\") pod \"certified-operators-9lf79\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") " pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.346289 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-catalog-content\") pod \"certified-operators-9lf79\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") " pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.369271 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hl4m\" (UniqueName: \"kubernetes.io/projected/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-kube-api-access-6hl4m\") pod \"certified-operators-9lf79\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") " pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:28 crc kubenswrapper[4759]: I1205 01:49:28.546784 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:29 crc kubenswrapper[4759]: I1205 01:49:29.047509 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lf79"]
Dec 05 01:49:29 crc kubenswrapper[4759]: I1205 01:49:29.157255 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"
Dec 05 01:49:29 crc kubenswrapper[4759]: E1205 01:49:29.157505 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:49:30 crc kubenswrapper[4759]: I1205 01:49:30.489174 4759 generic.go:334] "Generic (PLEG): container finished" podID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" containerID="b1fe80c48e06c4d1691f5df7a5040171c3e983f5ddf6d1ce08ee07d2e1f5d9d1" exitCode=0
Dec 05 01:49:30 crc kubenswrapper[4759]: I1205 01:49:30.489331 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lf79" event={"ID":"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954","Type":"ContainerDied","Data":"b1fe80c48e06c4d1691f5df7a5040171c3e983f5ddf6d1ce08ee07d2e1f5d9d1"}
Dec 05 01:49:30 crc kubenswrapper[4759]: I1205 01:49:30.490792 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lf79" event={"ID":"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954","Type":"ContainerStarted","Data":"35178a320073710f61c14d7caefde67b745d561f584cc9fe7ff8ce14551e3162"}
Dec 05 01:49:31 crc kubenswrapper[4759]: I1205 01:49:31.506155 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lf79" event={"ID":"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954","Type":"ContainerStarted","Data":"b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7"}
Dec 05 01:49:32 crc kubenswrapper[4759]: I1205 01:49:32.521758 4759 generic.go:334] "Generic (PLEG): container finished" podID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" containerID="b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7" exitCode=0
Dec 05 01:49:32 crc kubenswrapper[4759]: I1205 01:49:32.521887 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lf79" event={"ID":"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954","Type":"ContainerDied","Data":"b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7"}
Dec 05 01:49:33 crc kubenswrapper[4759]: I1205 01:49:33.535416 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lf79" event={"ID":"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954","Type":"ContainerStarted","Data":"c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844"}
Dec 05 01:49:33 crc kubenswrapper[4759]: I1205 01:49:33.559174 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9lf79" podStartSLOduration=3.055245167 podStartE2EDuration="5.559155764s" podCreationTimestamp="2025-12-05 01:49:28 +0000 UTC" firstStartedPulling="2025-12-05 01:49:30.49253492 +0000 UTC m=+5189.708195860" lastFinishedPulling="2025-12-05 01:49:32.996445507 +0000 UTC m=+5192.212106457" observedRunningTime="2025-12-05 01:49:33.55368111 +0000 UTC m=+5192.769342090" watchObservedRunningTime="2025-12-05 01:49:33.559155764 +0000 UTC m=+5192.774816714"
Dec 05 01:49:38 crc kubenswrapper[4759]: I1205 01:49:38.547498 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:38 crc kubenswrapper[4759]: I1205 01:49:38.548042 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:38 crc kubenswrapper[4759]: I1205 01:49:38.620889 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:38 crc kubenswrapper[4759]: I1205 01:49:38.686369 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:38 crc kubenswrapper[4759]: I1205 01:49:38.873453 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lf79"]
Dec 05 01:49:40 crc kubenswrapper[4759]: I1205 01:49:40.632178 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9lf79" podUID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" containerName="registry-server" containerID="cri-o://c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844" gracePeriod=2
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.309415 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.432726 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hl4m\" (UniqueName: \"kubernetes.io/projected/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-kube-api-access-6hl4m\") pod \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") "
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.432849 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-utilities\") pod \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") "
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.432890 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-catalog-content\") pod \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\" (UID: \"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954\") "
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.433973 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-utilities" (OuterVolumeSpecName: "utilities") pod "bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" (UID: "bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.439526 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-kube-api-access-6hl4m" (OuterVolumeSpecName: "kube-api-access-6hl4m") pod "bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" (UID: "bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954"). InnerVolumeSpecName "kube-api-access-6hl4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.480216 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" (UID: "bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.536131 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hl4m\" (UniqueName: \"kubernetes.io/projected/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-kube-api-access-6hl4m\") on node \"crc\" DevicePath \"\""
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.536181 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.536200 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.644153 4759 generic.go:334] "Generic (PLEG): container finished" podID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" containerID="c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844" exitCode=0
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.644206 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lf79"
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.644204 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lf79" event={"ID":"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954","Type":"ContainerDied","Data":"c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844"}
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.644915 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lf79" event={"ID":"bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954","Type":"ContainerDied","Data":"35178a320073710f61c14d7caefde67b745d561f584cc9fe7ff8ce14551e3162"}
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.644939 4759 scope.go:117] "RemoveContainer" containerID="c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844"
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.665887 4759 scope.go:117] "RemoveContainer" containerID="b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7"
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.706163 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lf79"]
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.706955 4759 scope.go:117] "RemoveContainer" containerID="b1fe80c48e06c4d1691f5df7a5040171c3e983f5ddf6d1ce08ee07d2e1f5d9d1"
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.717893 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9lf79"]
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.761812 4759 scope.go:117] "RemoveContainer" containerID="c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844"
Dec 05 01:49:41 crc kubenswrapper[4759]: E1205 01:49:41.762414 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844\": container with ID starting with c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844 not found: ID does not exist" containerID="c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844"
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.762457 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844"} err="failed to get container status \"c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844\": rpc error: code = NotFound desc = could not find container \"c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844\": container with ID starting with c803235e355fb6872fe855c68868b9368eb5b396bed6ceba0733b560ab6ea844 not found: ID does not exist"
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.762485 4759 scope.go:117] "RemoveContainer" containerID="b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7"
Dec 05 01:49:41 crc kubenswrapper[4759]: E1205 01:49:41.762751 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7\": container with ID starting with b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7 not found: ID does not exist" containerID="b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7"
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.762794 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7"} err="failed to get container status \"b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7\": rpc error: code = NotFound desc = could not find container \"b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7\": container with ID starting with b47dd0d534fc6327cf8a80483a874036a46202831efe72b88421f71b50c26ae7 not found: ID does not exist"
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.762821 4759 scope.go:117] "RemoveContainer" containerID="b1fe80c48e06c4d1691f5df7a5040171c3e983f5ddf6d1ce08ee07d2e1f5d9d1"
Dec 05 01:49:41 crc kubenswrapper[4759]: E1205 01:49:41.763083 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1fe80c48e06c4d1691f5df7a5040171c3e983f5ddf6d1ce08ee07d2e1f5d9d1\": container with ID starting with b1fe80c48e06c4d1691f5df7a5040171c3e983f5ddf6d1ce08ee07d2e1f5d9d1 not found: ID does not exist" containerID="b1fe80c48e06c4d1691f5df7a5040171c3e983f5ddf6d1ce08ee07d2e1f5d9d1"
Dec 05 01:49:41 crc kubenswrapper[4759]: I1205 01:49:41.763114 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1fe80c48e06c4d1691f5df7a5040171c3e983f5ddf6d1ce08ee07d2e1f5d9d1"} err="failed to get container status \"b1fe80c48e06c4d1691f5df7a5040171c3e983f5ddf6d1ce08ee07d2e1f5d9d1\": rpc error: code = NotFound desc = could not find container \"b1fe80c48e06c4d1691f5df7a5040171c3e983f5ddf6d1ce08ee07d2e1f5d9d1\": container with ID starting with b1fe80c48e06c4d1691f5df7a5040171c3e983f5ddf6d1ce08ee07d2e1f5d9d1 not found: ID does not exist"
Dec 05 01:49:43 crc kubenswrapper[4759]: I1205 01:49:43.182292 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" path="/var/lib/kubelet/pods/bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954/volumes"
Dec 05 01:49:44 crc kubenswrapper[4759]: I1205 01:49:44.157023 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"
Dec 05 01:49:44 crc kubenswrapper[4759]: E1205 01:49:44.158016 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:49:56 crc kubenswrapper[4759]: I1205 01:49:56.156153 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"
Dec 05 01:49:56 crc kubenswrapper[4759]: E1205 01:49:56.157221 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:50:09 crc kubenswrapper[4759]: I1205 01:50:09.159106 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"
Dec 05 01:50:09 crc kubenswrapper[4759]: E1205 01:50:09.160183 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:50:22 crc kubenswrapper[4759]: I1205 01:50:22.156711 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"
Dec 05 01:50:22 crc kubenswrapper[4759]: E1205 01:50:22.158242 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:50:33 crc kubenswrapper[4759]: I1205 01:50:33.157494 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"
Dec 05 01:50:33 crc kubenswrapper[4759]: E1205 01:50:33.158887 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:50:47 crc kubenswrapper[4759]: I1205 01:50:47.157821 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"
Dec 05 01:50:47 crc kubenswrapper[4759]: E1205 01:50:47.158965 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:51:00 crc kubenswrapper[4759]: I1205 01:51:00.156884 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"
Dec 05 01:51:00 crc kubenswrapper[4759]: E1205 01:51:00.158294 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 01:51:14 crc kubenswrapper[4759]: I1205 01:51:14.155640 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805"
Dec 05 01:51:15 crc kubenswrapper[4759]: I1205 01:51:15.003936 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"c92a2a5f9d4fcaf0294f1cc475ee4a62d65abf003d584e80bb85cd7b26596c3a"}
Dec 05 01:52:18 crc kubenswrapper[4759]: I1205 01:52:18.083723 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-fa0a-account-create-update-x559s"]
Dec 05 01:52:18 crc kubenswrapper[4759]: I1205 01:52:18.102493 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-4rlm4"]
Dec 05 01:52:18 crc kubenswrapper[4759]: I1205 01:52:18.115359 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-4rlm4"]
Dec 05 01:52:18 crc kubenswrapper[4759]: I1205 01:52:18.127728 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-fa0a-account-create-update-x559s"]
Dec 05 01:52:19 crc kubenswrapper[4759]: I1205 01:52:19.176644 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="996d6f16-b82f-4780-8d29-e26f633bd570" path="/var/lib/kubelet/pods/996d6f16-b82f-4780-8d29-e26f633bd570/volumes"
Dec 05 01:52:19 crc kubenswrapper[4759]: I1205 01:52:19.178979 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6b8713d-0512-4172-ad78-b7bc92a1d9ba" path="/var/lib/kubelet/pods/c6b8713d-0512-4172-ad78-b7bc92a1d9ba/volumes"
Dec 05 01:52:34 crc kubenswrapper[4759]: I1205 01:52:34.857651 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mg8x9"]
Dec 05 01:52:34 crc kubenswrapper[4759]: E1205 01:52:34.859137 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" containerName="extract-utilities"
Dec 05 01:52:34 crc kubenswrapper[4759]: I1205 01:52:34.859161 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" containerName="extract-utilities"
Dec 05 01:52:34 crc kubenswrapper[4759]: E1205 01:52:34.859179 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" containerName="registry-server"
Dec 05 01:52:34 crc kubenswrapper[4759]: I1205 01:52:34.859190 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" containerName="registry-server"
Dec 05 01:52:34 crc kubenswrapper[4759]: E1205 01:52:34.859220 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" containerName="extract-content"
Dec 05 01:52:34 crc kubenswrapper[4759]: I1205 01:52:34.859230 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" containerName="extract-content"
Dec 05 01:52:34 crc kubenswrapper[4759]: I1205 01:52:34.859892 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf813cb2-9d8d-4fd8-aa1f-b6619d1b3954" containerName="registry-server"
Dec 05 01:52:34 crc kubenswrapper[4759]: I1205 01:52:34.862797 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:34 crc kubenswrapper[4759]: I1205 01:52:34.871380 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mg8x9"]
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.032699 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-utilities\") pod \"community-operators-mg8x9\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") " pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.034210 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrr8c\" (UniqueName: \"kubernetes.io/projected/cdca6183-5daf-4518-a297-546ee84a120e-kube-api-access-zrr8c\") pod \"community-operators-mg8x9\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") " pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.034712 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-catalog-content\") pod \"community-operators-mg8x9\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") " pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.136813 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-utilities\") pod \"community-operators-mg8x9\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") " pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.137233 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrr8c\" (UniqueName: \"kubernetes.io/projected/cdca6183-5daf-4518-a297-546ee84a120e-kube-api-access-zrr8c\") pod \"community-operators-mg8x9\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") " pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.137370 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-catalog-content\") pod \"community-operators-mg8x9\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") " pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.137700 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-utilities\") pod \"community-operators-mg8x9\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") " pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.137830 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-catalog-content\") pod \"community-operators-mg8x9\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") " pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.164298 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrr8c\" (UniqueName: \"kubernetes.io/projected/cdca6183-5daf-4518-a297-546ee84a120e-kube-api-access-zrr8c\") pod \"community-operators-mg8x9\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") " pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.186249 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.632002 4759 scope.go:117] "RemoveContainer" containerID="897967f4179143f090e9c302eccda8369c9d83a84f00e32eac29b842085f4d00"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.674153 4759 scope.go:117] "RemoveContainer" containerID="898cbc303a30965158b622e8d7400a1171c14af6e3e937db7aae492df35dcf51"
Dec 05 01:52:35 crc kubenswrapper[4759]: I1205 01:52:35.758643 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mg8x9"]
Dec 05 01:52:36 crc kubenswrapper[4759]: I1205 01:52:36.068360 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mg8x9" event={"ID":"cdca6183-5daf-4518-a297-546ee84a120e","Type":"ContainerStarted","Data":"82e292996dd969ee87596495716fbb8b3cb30e5b1748bf0275f3676ad5977b9b"}
Dec 05 01:52:37 crc kubenswrapper[4759]: I1205 01:52:37.081458 4759 generic.go:334] "Generic (PLEG): container finished" podID="cdca6183-5daf-4518-a297-546ee84a120e" containerID="d9becff9f8d469bb182903a2994b77f9c071dcaa6d445ff84af5b67f83dd1eb4" exitCode=0
Dec 05 01:52:37 crc kubenswrapper[4759]: I1205 01:52:37.081519 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mg8x9" event={"ID":"cdca6183-5daf-4518-a297-546ee84a120e","Type":"ContainerDied","Data":"d9becff9f8d469bb182903a2994b77f9c071dcaa6d445ff84af5b67f83dd1eb4"}
Dec 05 01:52:38 crc kubenswrapper[4759]: I1205 01:52:38.096483 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mg8x9" event={"ID":"cdca6183-5daf-4518-a297-546ee84a120e","Type":"ContainerStarted","Data":"7b7d89b5909670ae770353b198077174ae3064b8d2b17d97c203a787e399e5b9"}
Dec 05 01:52:40 crc kubenswrapper[4759]: I1205 01:52:40.133141 4759 generic.go:334] "Generic (PLEG): container finished" podID="cdca6183-5daf-4518-a297-546ee84a120e" containerID="7b7d89b5909670ae770353b198077174ae3064b8d2b17d97c203a787e399e5b9" exitCode=0
Dec 05 01:52:40 crc kubenswrapper[4759]: I1205 01:52:40.133256 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mg8x9" event={"ID":"cdca6183-5daf-4518-a297-546ee84a120e","Type":"ContainerDied","Data":"7b7d89b5909670ae770353b198077174ae3064b8d2b17d97c203a787e399e5b9"}
Dec 05 01:52:41 crc kubenswrapper[4759]: I1205 01:52:41.149980 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mg8x9" event={"ID":"cdca6183-5daf-4518-a297-546ee84a120e","Type":"ContainerStarted","Data":"89f7428bb1b0a645b9788c5a5a351158d439aa582c4b3af789a103dd2d99406f"}
Dec 05 01:52:41 crc kubenswrapper[4759]: I1205 01:52:41.195951 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mg8x9" podStartSLOduration=3.774696236 podStartE2EDuration="7.195914765s" podCreationTimestamp="2025-12-05 01:52:34 +0000 UTC" firstStartedPulling="2025-12-05 01:52:37.083409564 +0000 UTC m=+5376.299070514" lastFinishedPulling="2025-12-05 01:52:40.504628083 +0000 UTC m=+5379.720289043" observedRunningTime="2025-12-05 01:52:41.176745646 +0000 UTC m=+5380.392406596" watchObservedRunningTime="2025-12-05 01:52:41.195914765 +0000 UTC m=+5380.411575755"
Dec 05 01:52:45 crc kubenswrapper[4759]: I1205 01:52:45.189568 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:45 crc kubenswrapper[4759]: I1205 01:52:45.190275 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:45 crc kubenswrapper[4759]: I1205 01:52:45.274783 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:45 crc kubenswrapper[4759]: I1205 01:52:45.342232 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:45 crc kubenswrapper[4759]: I1205 01:52:45.519568 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mg8x9"]
Dec 05 01:52:47 crc kubenswrapper[4759]: I1205 01:52:47.254568 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mg8x9" podUID="cdca6183-5daf-4518-a297-546ee84a120e" containerName="registry-server" containerID="cri-o://89f7428bb1b0a645b9788c5a5a351158d439aa582c4b3af789a103dd2d99406f" gracePeriod=2
Dec 05 01:52:48 crc kubenswrapper[4759]: I1205 01:52:48.064977 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-ppb4v"]
Dec 05 01:52:48 crc kubenswrapper[4759]: I1205 01:52:48.078017 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-ppb4v"]
Dec 05 01:52:48 crc kubenswrapper[4759]: I1205 01:52:48.273863 4759 generic.go:334] "Generic (PLEG): container finished" podID="cdca6183-5daf-4518-a297-546ee84a120e" containerID="89f7428bb1b0a645b9788c5a5a351158d439aa582c4b3af789a103dd2d99406f" exitCode=0
Dec 05 01:52:48 crc kubenswrapper[4759]: I1205 01:52:48.273915 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mg8x9" event={"ID":"cdca6183-5daf-4518-a297-546ee84a120e","Type":"ContainerDied","Data":"89f7428bb1b0a645b9788c5a5a351158d439aa582c4b3af789a103dd2d99406f"}
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.172227 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6457d70c-b3df-4b29-9d07-9f4ebc24b1b6" path="/var/lib/kubelet/pods/6457d70c-b3df-4b29-9d07-9f4ebc24b1b6/volumes"
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.286182 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mg8x9" event={"ID":"cdca6183-5daf-4518-a297-546ee84a120e","Type":"ContainerDied","Data":"82e292996dd969ee87596495716fbb8b3cb30e5b1748bf0275f3676ad5977b9b"}
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.286457 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82e292996dd969ee87596495716fbb8b3cb30e5b1748bf0275f3676ad5977b9b"
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.289450 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mg8x9"
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.338745 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-catalog-content\") pod \"cdca6183-5daf-4518-a297-546ee84a120e\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") "
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.339245 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrr8c\" (UniqueName: \"kubernetes.io/projected/cdca6183-5daf-4518-a297-546ee84a120e-kube-api-access-zrr8c\") pod \"cdca6183-5daf-4518-a297-546ee84a120e\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") "
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.339438 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-utilities\") pod \"cdca6183-5daf-4518-a297-546ee84a120e\" (UID: \"cdca6183-5daf-4518-a297-546ee84a120e\") "
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.341409 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-utilities" (OuterVolumeSpecName: "utilities") pod "cdca6183-5daf-4518-a297-546ee84a120e" (UID: "cdca6183-5daf-4518-a297-546ee84a120e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.351899 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdca6183-5daf-4518-a297-546ee84a120e-kube-api-access-zrr8c" (OuterVolumeSpecName: "kube-api-access-zrr8c") pod "cdca6183-5daf-4518-a297-546ee84a120e" (UID: "cdca6183-5daf-4518-a297-546ee84a120e"). InnerVolumeSpecName "kube-api-access-zrr8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.418225 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdca6183-5daf-4518-a297-546ee84a120e" (UID: "cdca6183-5daf-4518-a297-546ee84a120e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.442526 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.442564 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrr8c\" (UniqueName: \"kubernetes.io/projected/cdca6183-5daf-4518-a297-546ee84a120e-kube-api-access-zrr8c\") on node \"crc\" DevicePath \"\""
Dec 05 01:52:49 crc kubenswrapper[4759]: I1205 01:52:49.442576 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdca6183-5daf-4518-a297-546ee84a120e-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 01:52:50 crc kubenswrapper[4759]: I1205 01:52:50.299895 4759 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-mg8x9" Dec 05 01:52:50 crc kubenswrapper[4759]: I1205 01:52:50.368363 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mg8x9"] Dec 05 01:52:50 crc kubenswrapper[4759]: I1205 01:52:50.384905 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mg8x9"] Dec 05 01:52:51 crc kubenswrapper[4759]: I1205 01:52:51.171915 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdca6183-5daf-4518-a297-546ee84a120e" path="/var/lib/kubelet/pods/cdca6183-5daf-4518-a297-546ee84a120e/volumes" Dec 05 01:53:34 crc kubenswrapper[4759]: I1205 01:53:34.433147 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:53:34 crc kubenswrapper[4759]: I1205 01:53:34.434105 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:53:35 crc kubenswrapper[4759]: I1205 01:53:35.872150 4759 scope.go:117] "RemoveContainer" containerID="149e653f61d81e82c03299e1eabddf754805cd1e7e74616750366bb31a2d95ab" Dec 05 01:54:04 crc kubenswrapper[4759]: I1205 01:54:04.433249 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:54:04 crc kubenswrapper[4759]: I1205 01:54:04.433809 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:54:34 crc kubenswrapper[4759]: I1205 01:54:34.433136 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:54:34 crc kubenswrapper[4759]: I1205 01:54:34.433875 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:54:34 crc kubenswrapper[4759]: I1205 01:54:34.433945 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 01:54:34 crc kubenswrapper[4759]: I1205 01:54:34.435107 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c92a2a5f9d4fcaf0294f1cc475ee4a62d65abf003d584e80bb85cd7b26596c3a"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 01:54:34 crc kubenswrapper[4759]: I1205 01:54:34.435206 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://c92a2a5f9d4fcaf0294f1cc475ee4a62d65abf003d584e80bb85cd7b26596c3a" gracePeriod=600 Dec 05 01:54:34 crc kubenswrapper[4759]: I1205 01:54:34.694120 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="c92a2a5f9d4fcaf0294f1cc475ee4a62d65abf003d584e80bb85cd7b26596c3a" exitCode=0 Dec 05 01:54:34 crc kubenswrapper[4759]: I1205 01:54:34.694234 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"c92a2a5f9d4fcaf0294f1cc475ee4a62d65abf003d584e80bb85cd7b26596c3a"} Dec 05 01:54:34 crc kubenswrapper[4759]: I1205 01:54:34.694493 4759 scope.go:117] "RemoveContainer" containerID="ab819202d3c95243b5bd5690096b0791bd8bc1586b08f1fd8cbfcadeba03a805" Dec 05 01:54:35 crc kubenswrapper[4759]: I1205 01:54:35.712450 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63"} Dec 05 01:55:55 crc kubenswrapper[4759]: E1205 01:55:55.814978 4759 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.150:56920->38.102.83.150:42699: write tcp 38.102.83.150:56920->38.102.83.150:42699: write: broken pipe Dec 05 01:56:34 crc kubenswrapper[4759]: I1205 01:56:34.434041 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:56:34 crc kubenswrapper[4759]: I1205 01:56:34.436732 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:56:47 crc kubenswrapper[4759]: I1205 01:56:47.980957 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s6v4j"] Dec 05 01:56:47 crc kubenswrapper[4759]: E1205 01:56:47.982146 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdca6183-5daf-4518-a297-546ee84a120e" containerName="registry-server" Dec 05 01:56:47 crc kubenswrapper[4759]: I1205 01:56:47.982164 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdca6183-5daf-4518-a297-546ee84a120e" containerName="registry-server" Dec 05 01:56:47 crc kubenswrapper[4759]: E1205 01:56:47.982222 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdca6183-5daf-4518-a297-546ee84a120e" containerName="extract-utilities" Dec 05 
01:56:47 crc kubenswrapper[4759]: I1205 01:56:47.982231 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdca6183-5daf-4518-a297-546ee84a120e" containerName="extract-utilities" Dec 05 01:56:47 crc kubenswrapper[4759]: E1205 01:56:47.982260 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdca6183-5daf-4518-a297-546ee84a120e" containerName="extract-content" Dec 05 01:56:47 crc kubenswrapper[4759]: I1205 01:56:47.982269 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdca6183-5daf-4518-a297-546ee84a120e" containerName="extract-content" Dec 05 01:56:47 crc kubenswrapper[4759]: I1205 01:56:47.982557 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdca6183-5daf-4518-a297-546ee84a120e" containerName="registry-server" Dec 05 01:56:47 crc kubenswrapper[4759]: I1205 01:56:47.984637 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.025417 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6v4j"] Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.106476 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t54x6\" (UniqueName: \"kubernetes.io/projected/3e64221c-5543-47e2-8bd5-755ec0744c68-kube-api-access-t54x6\") pod \"redhat-marketplace-s6v4j\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.106846 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-catalog-content\") pod \"redhat-marketplace-s6v4j\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.106930 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-utilities\") pod \"redhat-marketplace-s6v4j\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.209049 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-catalog-content\") pod \"redhat-marketplace-s6v4j\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.209090 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-utilities\") pod \"redhat-marketplace-s6v4j\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.209199 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t54x6\" (UniqueName: \"kubernetes.io/projected/3e64221c-5543-47e2-8bd5-755ec0744c68-kube-api-access-t54x6\") pod \"redhat-marketplace-s6v4j\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " 
pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.209629 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-catalog-content\") pod \"redhat-marketplace-s6v4j\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.209823 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-utilities\") pod \"redhat-marketplace-s6v4j\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.229714 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t54x6\" (UniqueName: \"kubernetes.io/projected/3e64221c-5543-47e2-8bd5-755ec0744c68-kube-api-access-t54x6\") pod \"redhat-marketplace-s6v4j\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.358873 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:48 crc kubenswrapper[4759]: I1205 01:56:48.895900 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6v4j"] Dec 05 01:56:49 crc kubenswrapper[4759]: I1205 01:56:49.365501 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6v4j" event={"ID":"3e64221c-5543-47e2-8bd5-755ec0744c68","Type":"ContainerStarted","Data":"2ab2bc8a8d4e6870bd8da406d61c2d3f11293008b3e88871764fe96267766ee7"} Dec 05 01:56:50 crc kubenswrapper[4759]: I1205 01:56:50.382580 4759 generic.go:334] "Generic (PLEG): container finished" podID="3e64221c-5543-47e2-8bd5-755ec0744c68" containerID="41cc88ab3d3dfc59625e6b143fefe5c871b32665e7d0e9aac673a166b9793ceb" exitCode=0 Dec 05 01:56:50 crc kubenswrapper[4759]: I1205 01:56:50.382709 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6v4j" event={"ID":"3e64221c-5543-47e2-8bd5-755ec0744c68","Type":"ContainerDied","Data":"41cc88ab3d3dfc59625e6b143fefe5c871b32665e7d0e9aac673a166b9793ceb"} Dec 05 01:56:50 crc kubenswrapper[4759]: I1205 01:56:50.385171 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 01:56:52 crc kubenswrapper[4759]: I1205 01:56:52.405183 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6v4j" event={"ID":"3e64221c-5543-47e2-8bd5-755ec0744c68","Type":"ContainerStarted","Data":"933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39"} Dec 05 01:56:53 crc kubenswrapper[4759]: I1205 01:56:53.422610 4759 generic.go:334] "Generic (PLEG): container finished" podID="3e64221c-5543-47e2-8bd5-755ec0744c68" containerID="933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39" exitCode=0 Dec 05 01:56:53 crc kubenswrapper[4759]: I1205 01:56:53.422897 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6v4j" event={"ID":"3e64221c-5543-47e2-8bd5-755ec0744c68","Type":"ContainerDied","Data":"933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39"} Dec 05 
01:56:54 crc kubenswrapper[4759]: I1205 01:56:54.440200 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6v4j" event={"ID":"3e64221c-5543-47e2-8bd5-755ec0744c68","Type":"ContainerStarted","Data":"f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1"} Dec 05 01:56:54 crc kubenswrapper[4759]: I1205 01:56:54.474551 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s6v4j" podStartSLOduration=4.040486744 podStartE2EDuration="7.474521832s" podCreationTimestamp="2025-12-05 01:56:47 +0000 UTC" firstStartedPulling="2025-12-05 01:56:50.384543123 +0000 UTC m=+5629.600204113" lastFinishedPulling="2025-12-05 01:56:53.818578251 +0000 UTC m=+5633.034239201" observedRunningTime="2025-12-05 01:56:54.461579222 +0000 UTC m=+5633.677240232" watchObservedRunningTime="2025-12-05 01:56:54.474521832 +0000 UTC m=+5633.690182812" Dec 05 01:56:58 crc kubenswrapper[4759]: I1205 01:56:58.359094 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:58 crc kubenswrapper[4759]: I1205 01:56:58.359793 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:56:58 crc kubenswrapper[4759]: I1205 01:56:58.418749 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:57:04 crc kubenswrapper[4759]: I1205 01:57:04.433115 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:57:04 crc kubenswrapper[4759]: I1205 01:57:04.433655 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:57:08 crc kubenswrapper[4759]: I1205 01:57:08.434698 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:57:08 crc kubenswrapper[4759]: I1205 01:57:08.497154 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6v4j"] Dec 05 01:57:08 crc kubenswrapper[4759]: I1205 01:57:08.604230 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s6v4j" podUID="3e64221c-5543-47e2-8bd5-755ec0744c68" containerName="registry-server" containerID="cri-o://f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1" gracePeriod=2 Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.153404 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.322851 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-utilities\") pod \"3e64221c-5543-47e2-8bd5-755ec0744c68\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.323007 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-catalog-content\") pod \"3e64221c-5543-47e2-8bd5-755ec0744c68\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.323076 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t54x6\" (UniqueName: \"kubernetes.io/projected/3e64221c-5543-47e2-8bd5-755ec0744c68-kube-api-access-t54x6\") pod \"3e64221c-5543-47e2-8bd5-755ec0744c68\" (UID: \"3e64221c-5543-47e2-8bd5-755ec0744c68\") " Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.325045 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-utilities" (OuterVolumeSpecName: "utilities") pod "3e64221c-5543-47e2-8bd5-755ec0744c68" (UID: "3e64221c-5543-47e2-8bd5-755ec0744c68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.332002 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e64221c-5543-47e2-8bd5-755ec0744c68-kube-api-access-t54x6" (OuterVolumeSpecName: "kube-api-access-t54x6") pod "3e64221c-5543-47e2-8bd5-755ec0744c68" (UID: "3e64221c-5543-47e2-8bd5-755ec0744c68"). InnerVolumeSpecName "kube-api-access-t54x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.345707 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e64221c-5543-47e2-8bd5-755ec0744c68" (UID: "3e64221c-5543-47e2-8bd5-755ec0744c68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.426846 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.430941 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e64221c-5543-47e2-8bd5-755ec0744c68-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.431167 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t54x6\" (UniqueName: \"kubernetes.io/projected/3e64221c-5543-47e2-8bd5-755ec0744c68-kube-api-access-t54x6\") on node \"crc\" DevicePath \"\"" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.618613 4759 generic.go:334] "Generic (PLEG): container finished" podID="3e64221c-5543-47e2-8bd5-755ec0744c68" containerID="f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1" exitCode=0 Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.618744 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6v4j" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.619118 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6v4j" event={"ID":"3e64221c-5543-47e2-8bd5-755ec0744c68","Type":"ContainerDied","Data":"f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1"} Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.620626 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6v4j" event={"ID":"3e64221c-5543-47e2-8bd5-755ec0744c68","Type":"ContainerDied","Data":"2ab2bc8a8d4e6870bd8da406d61c2d3f11293008b3e88871764fe96267766ee7"} Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.620654 4759 scope.go:117] "RemoveContainer" containerID="f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.659774 4759 scope.go:117] "RemoveContainer" containerID="933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.670278 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6v4j"] Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.693184 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6v4j"] Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.703621 4759 scope.go:117] "RemoveContainer" containerID="41cc88ab3d3dfc59625e6b143fefe5c871b32665e7d0e9aac673a166b9793ceb" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.767049 4759 scope.go:117] "RemoveContainer" containerID="f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1" Dec 05 01:57:09 crc kubenswrapper[4759]: E1205 01:57:09.767631 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1\": container with ID starting with f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1 not found: ID does not exist" containerID="f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.767668 4759 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1"} err="failed to get container status \"f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1\": rpc error: code = NotFound desc = could not find container \"f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1\": container with ID starting with f5ac601ebedc9a85a7e1a29904680a7f4c56227b99a6b42a2dbadc59aa266cf1 not found: ID does not exist" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.767695 4759 scope.go:117] "RemoveContainer" containerID="933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39" Dec 05 01:57:09 crc kubenswrapper[4759]: E1205 01:57:09.768005 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39\": container with ID starting with 933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39 not found: ID does not exist" containerID="933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.768033 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39"} err="failed to get container status \"933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39\": rpc error: code = NotFound desc = could not find container \"933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39\": container with ID starting with 933e2664d9996633ed9aedf5f3871cb70126f5ca78a13c967074cb8be13e5e39 not found: ID does not exist" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.768052 4759 scope.go:117] "RemoveContainer" containerID="41cc88ab3d3dfc59625e6b143fefe5c871b32665e7d0e9aac673a166b9793ceb" Dec 05 01:57:09 crc kubenswrapper[4759]: E1205 01:57:09.768339 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41cc88ab3d3dfc59625e6b143fefe5c871b32665e7d0e9aac673a166b9793ceb\": container with ID starting with 41cc88ab3d3dfc59625e6b143fefe5c871b32665e7d0e9aac673a166b9793ceb not found: ID does not exist" containerID="41cc88ab3d3dfc59625e6b143fefe5c871b32665e7d0e9aac673a166b9793ceb" Dec 05 01:57:09 crc kubenswrapper[4759]: I1205 01:57:09.768376 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41cc88ab3d3dfc59625e6b143fefe5c871b32665e7d0e9aac673a166b9793ceb"} err="failed to get container status \"41cc88ab3d3dfc59625e6b143fefe5c871b32665e7d0e9aac673a166b9793ceb\": rpc error: code = NotFound desc = could not find container \"41cc88ab3d3dfc59625e6b143fefe5c871b32665e7d0e9aac673a166b9793ceb\": container with ID starting with 41cc88ab3d3dfc59625e6b143fefe5c871b32665e7d0e9aac673a166b9793ceb not found: ID does not exist" Dec 05 01:57:11 crc kubenswrapper[4759]: I1205 01:57:11.177938 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e64221c-5543-47e2-8bd5-755ec0744c68" path="/var/lib/kubelet/pods/3e64221c-5543-47e2-8bd5-755ec0744c68/volumes" Dec 05 01:57:34 crc kubenswrapper[4759]: I1205 01:57:34.433976 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 01:57:34 crc kubenswrapper[4759]: I1205 01:57:34.434788 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 01:57:34 crc kubenswrapper[4759]: I1205 01:57:34.434854 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 01:57:34 crc kubenswrapper[4759]: I1205 01:57:34.436019 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 01:57:34 crc kubenswrapper[4759]: I1205 01:57:34.436135 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" gracePeriod=600 Dec 05 01:57:34 crc kubenswrapper[4759]: E1205 01:57:34.580657 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:57:34 crc kubenswrapper[4759]: I1205 01:57:34.947424 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" exitCode=0 Dec 05 01:57:34 crc kubenswrapper[4759]: I1205 01:57:34.947462 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63"} Dec 05 01:57:34 crc kubenswrapper[4759]: I1205 01:57:34.947520 4759 scope.go:117] "RemoveContainer" containerID="c92a2a5f9d4fcaf0294f1cc475ee4a62d65abf003d584e80bb85cd7b26596c3a" Dec 05 01:57:34 crc kubenswrapper[4759]: I1205 01:57:34.948771 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 01:57:34 crc kubenswrapper[4759]: E1205 01:57:34.949598 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:57:48 crc kubenswrapper[4759]: I1205 01:57:48.156733 4759 scope.go:117] "RemoveContainer" 
containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 01:57:48 crc kubenswrapper[4759]: E1205 01:57:48.157928 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:58:03 crc kubenswrapper[4759]: I1205 01:58:03.157853 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 01:58:03 crc kubenswrapper[4759]: E1205 01:58:03.158893 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:58:18 crc kubenswrapper[4759]: I1205 01:58:18.156298 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 01:58:18 crc kubenswrapper[4759]: E1205 01:58:18.157125 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:58:31 crc kubenswrapper[4759]: I1205 01:58:31.178504 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 01:58:31 crc kubenswrapper[4759]: E1205 01:58:31.179765 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:58:36 crc kubenswrapper[4759]: I1205 01:58:36.064712 4759 scope.go:117] "RemoveContainer" containerID="d9becff9f8d469bb182903a2994b77f9c071dcaa6d445ff84af5b67f83dd1eb4" Dec 05 01:58:46 crc kubenswrapper[4759]: I1205 01:58:46.157369 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 01:58:46 crc kubenswrapper[4759]: E1205 01:58:46.158426 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:58:57 crc kubenswrapper[4759]: I1205 01:58:57.163234 4759 scope.go:117] "RemoveContainer" 
containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 01:58:57 crc kubenswrapper[4759]: E1205 01:58:57.164222 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:59:11 crc kubenswrapper[4759]: I1205 01:59:11.163670 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 01:59:11 crc kubenswrapper[4759]: E1205 01:59:11.164505 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:59:22 crc kubenswrapper[4759]: I1205 01:59:22.156539 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 01:59:22 crc kubenswrapper[4759]: E1205 01:59:22.157454 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:59:28 crc kubenswrapper[4759]: I1205 01:59:28.913351 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cjwnt"] Dec 05 01:59:28 crc kubenswrapper[4759]: E1205 01:59:28.914336 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e64221c-5543-47e2-8bd5-755ec0744c68" containerName="registry-server" Dec 05 01:59:28 crc kubenswrapper[4759]: I1205 01:59:28.914350 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e64221c-5543-47e2-8bd5-755ec0744c68" containerName="registry-server" Dec 05 01:59:28 crc kubenswrapper[4759]: E1205 01:59:28.914368 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e64221c-5543-47e2-8bd5-755ec0744c68" containerName="extract-utilities" Dec 05 01:59:28 crc kubenswrapper[4759]: I1205 01:59:28.914374 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e64221c-5543-47e2-8bd5-755ec0744c68" containerName="extract-utilities" Dec 05 01:59:28 crc kubenswrapper[4759]: E1205 01:59:28.914388 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e64221c-5543-47e2-8bd5-755ec0744c68" containerName="extract-content" Dec 05 01:59:28 crc kubenswrapper[4759]: I1205 01:59:28.914394 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e64221c-5543-47e2-8bd5-755ec0744c68" containerName="extract-content" Dec 05 01:59:28 crc kubenswrapper[4759]: I1205 01:59:28.914614 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e64221c-5543-47e2-8bd5-755ec0744c68" containerName="registry-server" Dec 05 01:59:28 crc kubenswrapper[4759]: I1205 01:59:28.916205 4759 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:28 crc kubenswrapper[4759]: I1205 01:59:28.949128 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cjwnt"] Dec 05 01:59:29 crc kubenswrapper[4759]: I1205 01:59:29.038505 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-utilities\") pod \"redhat-operators-cjwnt\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") " pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:29 crc kubenswrapper[4759]: I1205 01:59:29.038794 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj4w9\" (UniqueName: \"kubernetes.io/projected/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-kube-api-access-mj4w9\") pod \"redhat-operators-cjwnt\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") " pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:29 crc kubenswrapper[4759]: I1205 01:59:29.038840 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-catalog-content\") pod \"redhat-operators-cjwnt\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") " pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:29 crc kubenswrapper[4759]: I1205 01:59:29.140760 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-utilities\") pod \"redhat-operators-cjwnt\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") " pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:29 crc kubenswrapper[4759]: I1205 01:59:29.140818 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj4w9\" (UniqueName: \"kubernetes.io/projected/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-kube-api-access-mj4w9\") pod \"redhat-operators-cjwnt\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") " pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:29 crc kubenswrapper[4759]: I1205 01:59:29.140855 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-catalog-content\") pod \"redhat-operators-cjwnt\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") " pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:29 crc kubenswrapper[4759]: I1205 01:59:29.141472 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-utilities\") pod \"redhat-operators-cjwnt\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") " pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:29 crc kubenswrapper[4759]: I1205 01:59:29.141499 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-catalog-content\") pod \"redhat-operators-cjwnt\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") " pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:29 crc kubenswrapper[4759]: I1205 01:59:29.166418 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mj4w9\" (UniqueName: \"kubernetes.io/projected/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-kube-api-access-mj4w9\") pod \"redhat-operators-cjwnt\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") " pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:29 crc kubenswrapper[4759]: I1205 01:59:29.246805 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:29 crc kubenswrapper[4759]: I1205 01:59:29.808529 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cjwnt"] Dec 05 01:59:30 crc kubenswrapper[4759]: I1205 01:59:30.675485 4759 generic.go:334] "Generic (PLEG): container finished" podID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerID="21ce73e9926df5cae7592bdbd2b4a77d3ee3a8b66349fff935a73d0a3dd59807" exitCode=0 Dec 05 01:59:30 crc kubenswrapper[4759]: I1205 01:59:30.675656 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjwnt" event={"ID":"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a","Type":"ContainerDied","Data":"21ce73e9926df5cae7592bdbd2b4a77d3ee3a8b66349fff935a73d0a3dd59807"} Dec 05 01:59:30 crc kubenswrapper[4759]: I1205 01:59:30.675962 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjwnt" event={"ID":"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a","Type":"ContainerStarted","Data":"f63633136a16578b7e043df718088df5fc5322ec457c5e61594df5163fd08cce"} Dec 05 01:59:31 crc kubenswrapper[4759]: I1205 01:59:31.687136 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjwnt" event={"ID":"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a","Type":"ContainerStarted","Data":"8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b"} Dec 05 01:59:34 crc kubenswrapper[4759]: I1205 01:59:34.724019 4759 generic.go:334] "Generic (PLEG): container finished" podID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerID="8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b" exitCode=0 Dec 05 01:59:34 crc kubenswrapper[4759]: I1205 01:59:34.724091 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjwnt" event={"ID":"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a","Type":"ContainerDied","Data":"8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b"} Dec 05 01:59:35 crc kubenswrapper[4759]: I1205 01:59:35.738408 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjwnt" event={"ID":"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a","Type":"ContainerStarted","Data":"467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a"} Dec 05 01:59:35 crc kubenswrapper[4759]: I1205 01:59:35.767629 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cjwnt" podStartSLOduration=3.2974222109999998 podStartE2EDuration="7.767605158s" podCreationTimestamp="2025-12-05 01:59:28 +0000 UTC" firstStartedPulling="2025-12-05 01:59:30.680490176 +0000 UTC m=+5789.896151126" lastFinishedPulling="2025-12-05 01:59:35.150673123 +0000 UTC m=+5794.366334073" observedRunningTime="2025-12-05 01:59:35.756675777 +0000 UTC m=+5794.972336737" watchObservedRunningTime="2025-12-05 01:59:35.767605158 +0000 UTC m=+5794.983266108" Dec 05 01:59:36 crc kubenswrapper[4759]: I1205 01:59:36.123164 4759 scope.go:117] "RemoveContainer" 
containerID="89f7428bb1b0a645b9788c5a5a351158d439aa582c4b3af789a103dd2d99406f" Dec 05 01:59:36 crc kubenswrapper[4759]: I1205 01:59:36.148847 4759 scope.go:117] "RemoveContainer" containerID="7b7d89b5909670ae770353b198077174ae3064b8d2b17d97c203a787e399e5b9" Dec 05 01:59:37 crc kubenswrapper[4759]: I1205 01:59:37.161198 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 01:59:37 crc kubenswrapper[4759]: E1205 01:59:37.161857 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:59:39 crc kubenswrapper[4759]: I1205 01:59:39.248032 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:39 crc kubenswrapper[4759]: I1205 01:59:39.248429 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:40 crc kubenswrapper[4759]: I1205 01:59:40.310765 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cjwnt" podUID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerName="registry-server" probeResult="failure" output=< Dec 05 01:59:40 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 01:59:40 crc kubenswrapper[4759]: > Dec 05 01:59:48 crc kubenswrapper[4759]: I1205 01:59:48.156896 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 01:59:48 crc kubenswrapper[4759]: E1205 01:59:48.158256 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 01:59:49 crc kubenswrapper[4759]: I1205 01:59:49.320124 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:49 crc kubenswrapper[4759]: I1205 01:59:49.382542 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cjwnt" Dec 05 01:59:49 crc kubenswrapper[4759]: I1205 01:59:49.566649 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cjwnt"] Dec 05 01:59:50 crc kubenswrapper[4759]: I1205 01:59:50.991903 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cjwnt" podUID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerName="registry-server" containerID="cri-o://467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a" gracePeriod=2 Dec 05 01:59:51 crc kubenswrapper[4759]: I1205 01:59:51.623781 4759 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 01:59:51 crc kubenswrapper[4759]: I1205 01:59:51.623781 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cjwnt"
Dec 05 01:59:51 crc kubenswrapper[4759]: I1205 01:59:51.690934 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj4w9\" (UniqueName: \"kubernetes.io/projected/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-kube-api-access-mj4w9\") pod \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") "
Dec 05 01:59:51 crc kubenswrapper[4759]: I1205 01:59:51.691014 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-utilities\") pod \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") "
Dec 05 01:59:51 crc kubenswrapper[4759]: I1205 01:59:51.691112 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-catalog-content\") pod \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\" (UID: \"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a\") "
Dec 05 01:59:51 crc kubenswrapper[4759]: I1205 01:59:51.692748 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-utilities" (OuterVolumeSpecName: "utilities") pod "a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" (UID: "a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:59:51 crc kubenswrapper[4759]: I1205 01:59:51.701653 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-kube-api-access-mj4w9" (OuterVolumeSpecName: "kube-api-access-mj4w9") pod "a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" (UID: "a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a"). InnerVolumeSpecName "kube-api-access-mj4w9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 01:59:51 crc kubenswrapper[4759]: I1205 01:59:51.793512 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj4w9\" (UniqueName: \"kubernetes.io/projected/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-kube-api-access-mj4w9\") on node \"crc\" DevicePath \"\""
Dec 05 01:59:51 crc kubenswrapper[4759]: I1205 01:59:51.793550 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 01:59:51 crc kubenswrapper[4759]: I1205 01:59:51.817913 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" (UID: "a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 01:59:51 crc kubenswrapper[4759]: I1205 01:59:51.895683 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.008806 4759 generic.go:334] "Generic (PLEG): container finished" podID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerID="467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a" exitCode=0
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.008852 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjwnt" event={"ID":"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a","Type":"ContainerDied","Data":"467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a"}
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.008890 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cjwnt"
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.008906 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjwnt" event={"ID":"a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a","Type":"ContainerDied","Data":"f63633136a16578b7e043df718088df5fc5322ec457c5e61594df5163fd08cce"}
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.008938 4759 scope.go:117] "RemoveContainer" containerID="467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a"
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.032248 4759 scope.go:117] "RemoveContainer" containerID="8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b"
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.053635 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cjwnt"]
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.068074 4759 scope.go:117] "RemoveContainer" containerID="21ce73e9926df5cae7592bdbd2b4a77d3ee3a8b66349fff935a73d0a3dd59807"
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.069157 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cjwnt"]
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.114416 4759 scope.go:117] "RemoveContainer" containerID="467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a"
Dec 05 01:59:52 crc kubenswrapper[4759]: E1205 01:59:52.114929 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a\": container with ID starting with 467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a not found: ID does not exist" containerID="467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a"
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.114975 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a"} err="failed to get container status \"467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a\": rpc error: code = NotFound desc = could not find container \"467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a\": container with ID starting with 467cbf65eaba3c75073da74745e7c236d57a6577611cb8fe0d2055fe70032e0a not found: ID does not exist"
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.115000 4759 scope.go:117] "RemoveContainer" containerID="8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b"
Dec 05 01:59:52 crc kubenswrapper[4759]: E1205 01:59:52.115714 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b\": container with ID starting with 8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b not found: ID does not exist" containerID="8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b"
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.115772 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b"} err="failed to get container status \"8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b\": rpc error: code = NotFound desc = could not find container \"8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b\": container with ID starting with 8ff9ca96d421618953b5d69cb8948a182f9d2ff786c82e561d4b25316f83320b not found: ID does not exist"
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.115810 4759 scope.go:117] "RemoveContainer" containerID="21ce73e9926df5cae7592bdbd2b4a77d3ee3a8b66349fff935a73d0a3dd59807"
Dec 05 01:59:52 crc kubenswrapper[4759]: E1205 01:59:52.116362 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ce73e9926df5cae7592bdbd2b4a77d3ee3a8b66349fff935a73d0a3dd59807\": container with ID starting with 21ce73e9926df5cae7592bdbd2b4a77d3ee3a8b66349fff935a73d0a3dd59807 not found: ID does not exist" containerID="21ce73e9926df5cae7592bdbd2b4a77d3ee3a8b66349fff935a73d0a3dd59807"
Dec 05 01:59:52 crc kubenswrapper[4759]: I1205 01:59:52.116443 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ce73e9926df5cae7592bdbd2b4a77d3ee3a8b66349fff935a73d0a3dd59807"} err="failed to get container status \"21ce73e9926df5cae7592bdbd2b4a77d3ee3a8b66349fff935a73d0a3dd59807\": rpc error: code = NotFound desc = could not find container \"21ce73e9926df5cae7592bdbd2b4a77d3ee3a8b66349fff935a73d0a3dd59807\": container with ID starting with 21ce73e9926df5cae7592bdbd2b4a77d3ee3a8b66349fff935a73d0a3dd59807 not found: ID does not exist"
Dec 05 01:59:53 crc kubenswrapper[4759]: I1205 01:59:53.167602 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" path="/var/lib/kubelet/pods/a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a/volumes"
Dec 05 01:59:59 crc kubenswrapper[4759]: I1205 01:59:59.155859 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63"
Dec 05 01:59:59 crc kubenswrapper[4759]: E1205 01:59:59.156813 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
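The machine-config-daemon pair of entries that repeats throughout this log is the kubelet's crash-loop backoff at its ceiling: each sync loop notes "RemoveContainer" for the dead container, then pod_workers refuses to restart it until the back-off window ("back-off 5m0s") expires. A sketch of the backoff shape, assuming the kubelet's usual defaults of a 10s initial delay doubling up to a 5m cap (the defaults are an assumption here; only the 5m cap is visible in the message):

    package main

    import (
    	"fmt"
    	"time"
    )

    // Sketch: exponential restart backoff with a hard cap, the policy
    // behind "back-off 5m0s restarting failed container=...".
    func backoffAfter(restarts int) time.Duration {
    	d := 10 * time.Second // assumed initial delay
    	for i := 0; i < restarts; i++ {
    		d *= 2
    		if d >= 5*time.Minute {
    			return 5 * time.Minute // cap seen in the log
    		}
    	}
    	return d
    }

    func main() {
    	for r := 0; r <= 6; r++ {
    		fmt.Printf("after %d restarts wait %v\n", r, backoffAfter(r))
    	}
    }

Once the cap is reached, the "Error syncing pod, skipping" line simply recurs on every sync until roughly five minutes have elapsed since the last failed start.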
pods=["openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l"] Dec 05 02:00:00 crc kubenswrapper[4759]: E1205 02:00:00.167846 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerName="extract-utilities" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.167941 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerName="extract-utilities" Dec 05 02:00:00 crc kubenswrapper[4759]: E1205 02:00:00.168055 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerName="registry-server" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.168121 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerName="registry-server" Dec 05 02:00:00 crc kubenswrapper[4759]: E1205 02:00:00.168198 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerName="extract-content" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.168253 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerName="extract-content" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.168551 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7867cc4-d38e-4ce2-b742-1f9a9d2bf02a" containerName="registry-server" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.169467 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.173788 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.173799 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.179654 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l"] Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.240850 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702f6722-3769-4ac1-b98f-4b29a2001133-config-volume\") pod \"collect-profiles-29415000-zct4l\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.241156 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/702f6722-3769-4ac1-b98f-4b29a2001133-secret-volume\") pod \"collect-profiles-29415000-zct4l\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.241918 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdmhb\" (UniqueName: \"kubernetes.io/projected/702f6722-3769-4ac1-b98f-4b29a2001133-kube-api-access-mdmhb\") pod \"collect-profiles-29415000-zct4l\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.343579 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702f6722-3769-4ac1-b98f-4b29a2001133-config-volume\") pod \"collect-profiles-29415000-zct4l\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.343689 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/702f6722-3769-4ac1-b98f-4b29a2001133-secret-volume\") pod \"collect-profiles-29415000-zct4l\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.343763 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdmhb\" (UniqueName: \"kubernetes.io/projected/702f6722-3769-4ac1-b98f-4b29a2001133-kube-api-access-mdmhb\") pod \"collect-profiles-29415000-zct4l\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.344915 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702f6722-3769-4ac1-b98f-4b29a2001133-config-volume\") pod \"collect-profiles-29415000-zct4l\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.352126 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/702f6722-3769-4ac1-b98f-4b29a2001133-secret-volume\") pod \"collect-profiles-29415000-zct4l\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.372097 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdmhb\" (UniqueName: \"kubernetes.io/projected/702f6722-3769-4ac1-b98f-4b29a2001133-kube-api-access-mdmhb\") pod \"collect-profiles-29415000-zct4l\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.501387 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:00 crc kubenswrapper[4759]: I1205 02:00:00.977853 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l"] Dec 05 02:00:01 crc kubenswrapper[4759]: I1205 02:00:01.172356 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" event={"ID":"702f6722-3769-4ac1-b98f-4b29a2001133","Type":"ContainerStarted","Data":"448da3b8ff548aefa74823b095ed396591ea9b6fefd21e44041b4a4250a653d7"} Dec 05 02:00:02 crc kubenswrapper[4759]: I1205 02:00:02.219970 4759 generic.go:334] "Generic (PLEG): container finished" podID="702f6722-3769-4ac1-b98f-4b29a2001133" containerID="80cd4fd957cba07397718e0988db6c83370a81e26f18f393f202a78652e86e5d" exitCode=0 Dec 05 02:00:02 crc kubenswrapper[4759]: I1205 02:00:02.220395 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" event={"ID":"702f6722-3769-4ac1-b98f-4b29a2001133","Type":"ContainerDied","Data":"80cd4fd957cba07397718e0988db6c83370a81e26f18f393f202a78652e86e5d"} Dec 05 02:00:03 crc kubenswrapper[4759]: I1205 02:00:03.784526 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:03 crc kubenswrapper[4759]: I1205 02:00:03.945876 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702f6722-3769-4ac1-b98f-4b29a2001133-config-volume\") pod \"702f6722-3769-4ac1-b98f-4b29a2001133\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " Dec 05 02:00:03 crc kubenswrapper[4759]: I1205 02:00:03.946290 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/702f6722-3769-4ac1-b98f-4b29a2001133-secret-volume\") pod \"702f6722-3769-4ac1-b98f-4b29a2001133\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " Dec 05 02:00:03 crc kubenswrapper[4759]: I1205 02:00:03.946389 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdmhb\" (UniqueName: \"kubernetes.io/projected/702f6722-3769-4ac1-b98f-4b29a2001133-kube-api-access-mdmhb\") pod \"702f6722-3769-4ac1-b98f-4b29a2001133\" (UID: \"702f6722-3769-4ac1-b98f-4b29a2001133\") " Dec 05 02:00:03 crc kubenswrapper[4759]: I1205 02:00:03.946771 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/702f6722-3769-4ac1-b98f-4b29a2001133-config-volume" (OuterVolumeSpecName: "config-volume") pod "702f6722-3769-4ac1-b98f-4b29a2001133" (UID: "702f6722-3769-4ac1-b98f-4b29a2001133"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 02:00:03 crc kubenswrapper[4759]: I1205 02:00:03.946908 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702f6722-3769-4ac1-b98f-4b29a2001133-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 02:00:03 crc kubenswrapper[4759]: I1205 02:00:03.953699 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702f6722-3769-4ac1-b98f-4b29a2001133-kube-api-access-mdmhb" (OuterVolumeSpecName: "kube-api-access-mdmhb") pod "702f6722-3769-4ac1-b98f-4b29a2001133" (UID: "702f6722-3769-4ac1-b98f-4b29a2001133"). InnerVolumeSpecName "kube-api-access-mdmhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:00:03 crc kubenswrapper[4759]: I1205 02:00:03.954547 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702f6722-3769-4ac1-b98f-4b29a2001133-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "702f6722-3769-4ac1-b98f-4b29a2001133" (UID: "702f6722-3769-4ac1-b98f-4b29a2001133"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 02:00:04 crc kubenswrapper[4759]: I1205 02:00:04.049066 4759 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/702f6722-3769-4ac1-b98f-4b29a2001133-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 02:00:04 crc kubenswrapper[4759]: I1205 02:00:04.049097 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdmhb\" (UniqueName: \"kubernetes.io/projected/702f6722-3769-4ac1-b98f-4b29a2001133-kube-api-access-mdmhb\") on node \"crc\" DevicePath \"\"" Dec 05 02:00:04 crc kubenswrapper[4759]: I1205 02:00:04.257085 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" event={"ID":"702f6722-3769-4ac1-b98f-4b29a2001133","Type":"ContainerDied","Data":"448da3b8ff548aefa74823b095ed396591ea9b6fefd21e44041b4a4250a653d7"} Dec 05 02:00:04 crc kubenswrapper[4759]: I1205 02:00:04.257227 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="448da3b8ff548aefa74823b095ed396591ea9b6fefd21e44041b4a4250a653d7" Dec 05 02:00:04 crc kubenswrapper[4759]: I1205 02:00:04.257158 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l" Dec 05 02:00:04 crc kubenswrapper[4759]: I1205 02:00:04.863882 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"] Dec 05 02:00:04 crc kubenswrapper[4759]: I1205 02:00:04.877665 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414955-mcnw6"] Dec 05 02:00:05 crc kubenswrapper[4759]: I1205 02:00:05.173627 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a6b9c8c-896e-421a-9704-5e96eece6da1" path="/var/lib/kubelet/pods/8a6b9c8c-896e-421a-9704-5e96eece6da1/volumes" Dec 05 02:00:12 crc kubenswrapper[4759]: I1205 02:00:12.156738 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:00:12 crc kubenswrapper[4759]: E1205 02:00:12.157578 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:00:25 crc kubenswrapper[4759]: I1205 02:00:25.156486 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:00:25 crc kubenswrapper[4759]: E1205 02:00:25.157957 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:00:36 crc kubenswrapper[4759]: I1205 02:00:36.255859 4759 scope.go:117] "RemoveContainer" containerID="9a9a63dddaf02b72e93eaa2a6a61152734c1b3b62db1b21469670b56fcc09018" Dec 05 02:00:37 crc kubenswrapper[4759]: I1205 02:00:37.157904 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:00:37 crc kubenswrapper[4759]: E1205 02:00:37.158596 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:00:39 crc kubenswrapper[4759]: I1205 02:00:39.866022 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4646m"] Dec 05 02:00:39 crc kubenswrapper[4759]: E1205 02:00:39.867512 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702f6722-3769-4ac1-b98f-4b29a2001133" containerName="collect-profiles" Dec 05 02:00:39 crc kubenswrapper[4759]: I1205 02:00:39.867537 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="702f6722-3769-4ac1-b98f-4b29a2001133" containerName="collect-profiles" Dec 05 02:00:39 crc kubenswrapper[4759]: I1205 
Dec 05 02:00:39 crc kubenswrapper[4759]: I1205 02:00:39.867940 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="702f6722-3769-4ac1-b98f-4b29a2001133" containerName="collect-profiles"
Dec 05 02:00:39 crc kubenswrapper[4759]: I1205 02:00:39.871136 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:39 crc kubenswrapper[4759]: I1205 02:00:39.882330 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4646m"]
Dec 05 02:00:39 crc kubenswrapper[4759]: I1205 02:00:39.926062 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgls\" (UniqueName: \"kubernetes.io/projected/0c76f521-33bd-4474-8366-0f59278740f7-kube-api-access-7wgls\") pod \"certified-operators-4646m\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:39 crc kubenswrapper[4759]: I1205 02:00:39.926169 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-utilities\") pod \"certified-operators-4646m\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:39 crc kubenswrapper[4759]: I1205 02:00:39.926395 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-catalog-content\") pod \"certified-operators-4646m\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:40 crc kubenswrapper[4759]: I1205 02:00:40.028501 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wgls\" (UniqueName: \"kubernetes.io/projected/0c76f521-33bd-4474-8366-0f59278740f7-kube-api-access-7wgls\") pod \"certified-operators-4646m\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:40 crc kubenswrapper[4759]: I1205 02:00:40.028617 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-utilities\") pod \"certified-operators-4646m\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:40 crc kubenswrapper[4759]: I1205 02:00:40.028693 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-catalog-content\") pod \"certified-operators-4646m\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:40 crc kubenswrapper[4759]: I1205 02:00:40.029246 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-catalog-content\") pod \"certified-operators-4646m\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:40 crc kubenswrapper[4759]: I1205 02:00:40.029643 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-utilities\") pod \"certified-operators-4646m\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:40 crc kubenswrapper[4759]: I1205 02:00:40.053808 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wgls\" (UniqueName: \"kubernetes.io/projected/0c76f521-33bd-4474-8366-0f59278740f7-kube-api-access-7wgls\") pod \"certified-operators-4646m\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:40 crc kubenswrapper[4759]: I1205 02:00:40.220095 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:40 crc kubenswrapper[4759]: W1205 02:00:40.760230 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c76f521_33bd_4474_8366_0f59278740f7.slice/crio-674be36a5d567ea6b86b58fb16e6c5536608a54948903a55a75fbeaa34cbebd2 WatchSource:0}: Error finding container 674be36a5d567ea6b86b58fb16e6c5536608a54948903a55a75fbeaa34cbebd2: Status 404 returned error can't find the container with id 674be36a5d567ea6b86b58fb16e6c5536608a54948903a55a75fbeaa34cbebd2
Dec 05 02:00:40 crc kubenswrapper[4759]: I1205 02:00:40.760419 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4646m"]
Dec 05 02:00:41 crc kubenswrapper[4759]: I1205 02:00:41.737338 4759 generic.go:334] "Generic (PLEG): container finished" podID="0c76f521-33bd-4474-8366-0f59278740f7" containerID="80f3d59d311f7749d2ee72465742567377219639e55aa129dfca2d9f8e03774b" exitCode=0
Dec 05 02:00:41 crc kubenswrapper[4759]: I1205 02:00:41.737458 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4646m" event={"ID":"0c76f521-33bd-4474-8366-0f59278740f7","Type":"ContainerDied","Data":"80f3d59d311f7749d2ee72465742567377219639e55aa129dfca2d9f8e03774b"}
Dec 05 02:00:41 crc kubenswrapper[4759]: I1205 02:00:41.737988 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4646m" event={"ID":"0c76f521-33bd-4474-8366-0f59278740f7","Type":"ContainerStarted","Data":"674be36a5d567ea6b86b58fb16e6c5536608a54948903a55a75fbeaa34cbebd2"}
Dec 05 02:00:43 crc kubenswrapper[4759]: I1205 02:00:43.762027 4759 generic.go:334] "Generic (PLEG): container finished" podID="0c76f521-33bd-4474-8366-0f59278740f7" containerID="70c8a437a30cde49fc4a280bc6d094799f2ed949d7a12ca0194ef81d3db7c02d" exitCode=0
Dec 05 02:00:43 crc kubenswrapper[4759]: I1205 02:00:43.762118 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4646m" event={"ID":"0c76f521-33bd-4474-8366-0f59278740f7","Type":"ContainerDied","Data":"70c8a437a30cde49fc4a280bc6d094799f2ed949d7a12ca0194ef81d3db7c02d"}
Dec 05 02:00:45 crc kubenswrapper[4759]: I1205 02:00:45.786608 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4646m" event={"ID":"0c76f521-33bd-4474-8366-0f59278740f7","Type":"ContainerStarted","Data":"e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a"}
Dec 05 02:00:45 crc kubenswrapper[4759]: I1205 02:00:45.804837 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4646m" podStartSLOduration=4.164988145 podStartE2EDuration="6.804815012s" podCreationTimestamp="2025-12-05 02:00:39 +0000 UTC" firstStartedPulling="2025-12-05 02:00:41.739796912 +0000 UTC m=+5860.955457872" lastFinishedPulling="2025-12-05 02:00:44.379623789 +0000 UTC m=+5863.595284739" observedRunningTime="2025-12-05 02:00:45.803800427 +0000 UTC m=+5865.019461377" watchObservedRunningTime="2025-12-05 02:00:45.804815012 +0000 UTC m=+5865.020475972"
Dec 05 02:00:49 crc kubenswrapper[4759]: I1205 02:00:49.156218 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63"
Dec 05 02:00:49 crc kubenswrapper[4759]: E1205 02:00:49.157713 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:00:50 crc kubenswrapper[4759]: I1205 02:00:50.220929 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:50 crc kubenswrapper[4759]: I1205 02:00:50.220969 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:50 crc kubenswrapper[4759]: I1205 02:00:50.268131 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:50 crc kubenswrapper[4759]: I1205 02:00:50.909817 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4646m"
Dec 05 02:00:50 crc kubenswrapper[4759]: I1205 02:00:50.962822 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4646m"]
Dec 05 02:00:52 crc kubenswrapper[4759]: I1205 02:00:52.886510 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4646m" podUID="0c76f521-33bd-4474-8366-0f59278740f7" containerName="registry-server" containerID="cri-o://e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a" gracePeriod=2
Need to start a new one" pod="openshift-marketplace/certified-operators-4646m" Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.563301 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wgls\" (UniqueName: \"kubernetes.io/projected/0c76f521-33bd-4474-8366-0f59278740f7-kube-api-access-7wgls\") pod \"0c76f521-33bd-4474-8366-0f59278740f7\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.563358 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-catalog-content\") pod \"0c76f521-33bd-4474-8366-0f59278740f7\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.563722 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-utilities\") pod \"0c76f521-33bd-4474-8366-0f59278740f7\" (UID: \"0c76f521-33bd-4474-8366-0f59278740f7\") " Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.564467 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-utilities" (OuterVolumeSpecName: "utilities") pod "0c76f521-33bd-4474-8366-0f59278740f7" (UID: "0c76f521-33bd-4474-8366-0f59278740f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.571694 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c76f521-33bd-4474-8366-0f59278740f7-kube-api-access-7wgls" (OuterVolumeSpecName: "kube-api-access-7wgls") pod "0c76f521-33bd-4474-8366-0f59278740f7" (UID: "0c76f521-33bd-4474-8366-0f59278740f7"). InnerVolumeSpecName "kube-api-access-7wgls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.620692 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c76f521-33bd-4474-8366-0f59278740f7" (UID: "0c76f521-33bd-4474-8366-0f59278740f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.665990 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wgls\" (UniqueName: \"kubernetes.io/projected/0c76f521-33bd-4474-8366-0f59278740f7-kube-api-access-7wgls\") on node \"crc\" DevicePath \"\"" Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.666029 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.666041 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c76f521-33bd-4474-8366-0f59278740f7-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.907947 4759 generic.go:334] "Generic (PLEG): container finished" podID="0c76f521-33bd-4474-8366-0f59278740f7" containerID="e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a" exitCode=0 Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.908074 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4646m" event={"ID":"0c76f521-33bd-4474-8366-0f59278740f7","Type":"ContainerDied","Data":"e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a"} Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.908284 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4646m" event={"ID":"0c76f521-33bd-4474-8366-0f59278740f7","Type":"ContainerDied","Data":"674be36a5d567ea6b86b58fb16e6c5536608a54948903a55a75fbeaa34cbebd2"} Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.908199 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4646m" Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.908362 4759 scope.go:117] "RemoveContainer" containerID="e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a" Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.941726 4759 scope.go:117] "RemoveContainer" containerID="70c8a437a30cde49fc4a280bc6d094799f2ed949d7a12ca0194ef81d3db7c02d" Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.981364 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4646m"] Dec 05 02:00:53 crc kubenswrapper[4759]: I1205 02:00:53.998391 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4646m"] Dec 05 02:00:54 crc kubenswrapper[4759]: I1205 02:00:54.000899 4759 scope.go:117] "RemoveContainer" containerID="80f3d59d311f7749d2ee72465742567377219639e55aa129dfca2d9f8e03774b" Dec 05 02:00:54 crc kubenswrapper[4759]: I1205 02:00:54.068557 4759 scope.go:117] "RemoveContainer" containerID="e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a" Dec 05 02:00:54 crc kubenswrapper[4759]: E1205 02:00:54.069805 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a\": container with ID starting with e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a not found: ID does not exist" containerID="e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a" Dec 05 02:00:54 crc kubenswrapper[4759]: I1205 02:00:54.069847 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a"} err="failed to get container status \"e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a\": rpc error: code = NotFound desc = could not find container \"e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a\": container with ID starting with e7f7bf3fc84514eaa5ba383a207d142e79ca93caef5bbea487b3cf32b6db415a not found: ID does not exist" Dec 05 02:00:54 crc kubenswrapper[4759]: I1205 02:00:54.069868 4759 scope.go:117] "RemoveContainer" containerID="70c8a437a30cde49fc4a280bc6d094799f2ed949d7a12ca0194ef81d3db7c02d" Dec 05 02:00:54 crc kubenswrapper[4759]: E1205 02:00:54.070245 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c8a437a30cde49fc4a280bc6d094799f2ed949d7a12ca0194ef81d3db7c02d\": container with ID starting with 70c8a437a30cde49fc4a280bc6d094799f2ed949d7a12ca0194ef81d3db7c02d not found: ID does not exist" containerID="70c8a437a30cde49fc4a280bc6d094799f2ed949d7a12ca0194ef81d3db7c02d" Dec 05 02:00:54 crc kubenswrapper[4759]: I1205 02:00:54.070266 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c8a437a30cde49fc4a280bc6d094799f2ed949d7a12ca0194ef81d3db7c02d"} err="failed to get container status \"70c8a437a30cde49fc4a280bc6d094799f2ed949d7a12ca0194ef81d3db7c02d\": rpc error: code = NotFound desc = could not find container \"70c8a437a30cde49fc4a280bc6d094799f2ed949d7a12ca0194ef81d3db7c02d\": container with ID starting with 70c8a437a30cde49fc4a280bc6d094799f2ed949d7a12ca0194ef81d3db7c02d not found: ID does not exist" Dec 05 02:00:54 crc kubenswrapper[4759]: I1205 02:00:54.070279 4759 scope.go:117] "RemoveContainer" 
containerID="80f3d59d311f7749d2ee72465742567377219639e55aa129dfca2d9f8e03774b" Dec 05 02:00:54 crc kubenswrapper[4759]: E1205 02:00:54.070684 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f3d59d311f7749d2ee72465742567377219639e55aa129dfca2d9f8e03774b\": container with ID starting with 80f3d59d311f7749d2ee72465742567377219639e55aa129dfca2d9f8e03774b not found: ID does not exist" containerID="80f3d59d311f7749d2ee72465742567377219639e55aa129dfca2d9f8e03774b" Dec 05 02:00:54 crc kubenswrapper[4759]: I1205 02:00:54.070710 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f3d59d311f7749d2ee72465742567377219639e55aa129dfca2d9f8e03774b"} err="failed to get container status \"80f3d59d311f7749d2ee72465742567377219639e55aa129dfca2d9f8e03774b\": rpc error: code = NotFound desc = could not find container \"80f3d59d311f7749d2ee72465742567377219639e55aa129dfca2d9f8e03774b\": container with ID starting with 80f3d59d311f7749d2ee72465742567377219639e55aa129dfca2d9f8e03774b not found: ID does not exist" Dec 05 02:00:55 crc kubenswrapper[4759]: I1205 02:00:55.172720 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c76f521-33bd-4474-8366-0f59278740f7" path="/var/lib/kubelet/pods/0c76f521-33bd-4474-8366-0f59278740f7/volumes" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.194886 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29415001-qdv8x"] Dec 05 02:01:00 crc kubenswrapper[4759]: E1205 02:01:00.195851 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c76f521-33bd-4474-8366-0f59278740f7" containerName="registry-server" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.195866 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c76f521-33bd-4474-8366-0f59278740f7" containerName="registry-server" Dec 05 02:01:00 crc kubenswrapper[4759]: E1205 02:01:00.195921 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c76f521-33bd-4474-8366-0f59278740f7" containerName="extract-utilities" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.195930 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c76f521-33bd-4474-8366-0f59278740f7" containerName="extract-utilities" Dec 05 02:01:00 crc kubenswrapper[4759]: E1205 02:01:00.195952 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c76f521-33bd-4474-8366-0f59278740f7" containerName="extract-content" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.195959 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c76f521-33bd-4474-8366-0f59278740f7" containerName="extract-content" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.196211 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c76f521-33bd-4474-8366-0f59278740f7" containerName="registry-server" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.197137 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.223181 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415001-qdv8x"] Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.365283 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-combined-ca-bundle\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.365513 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-fernet-keys\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.365843 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmxwj\" (UniqueName: \"kubernetes.io/projected/f40db81b-0573-4c38-9382-5c23ef6cde76-kube-api-access-dmxwj\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.366217 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-config-data\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.469627 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-fernet-keys\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.469796 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmxwj\" (UniqueName: \"kubernetes.io/projected/f40db81b-0573-4c38-9382-5c23ef6cde76-kube-api-access-dmxwj\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.469855 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-config-data\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.469895 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-combined-ca-bundle\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.479105 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-combined-ca-bundle\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.479204 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-config-data\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.483254 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-fernet-keys\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.496681 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmxwj\" (UniqueName: \"kubernetes.io/projected/f40db81b-0573-4c38-9382-5c23ef6cde76-kube-api-access-dmxwj\") pod \"keystone-cron-29415001-qdv8x\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:00 crc kubenswrapper[4759]: I1205 02:01:00.538985 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:01 crc kubenswrapper[4759]: I1205 02:01:01.027482 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415001-qdv8x"] Dec 05 02:01:01 crc kubenswrapper[4759]: I1205 02:01:01.995273 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415001-qdv8x" event={"ID":"f40db81b-0573-4c38-9382-5c23ef6cde76","Type":"ContainerStarted","Data":"a4481b9aa3f59229c8266d759f94529fd16808243b291500e32f41e34bee61a4"} Dec 05 02:01:01 crc kubenswrapper[4759]: I1205 02:01:01.995862 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415001-qdv8x" event={"ID":"f40db81b-0573-4c38-9382-5c23ef6cde76","Type":"ContainerStarted","Data":"c6c27badb96970d27de57dab606e3523846fd04b343f5024e4430e43d4b318ce"} Dec 05 02:01:02 crc kubenswrapper[4759]: I1205 02:01:02.020005 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29415001-qdv8x" podStartSLOduration=2.019989146 podStartE2EDuration="2.019989146s" podCreationTimestamp="2025-12-05 02:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 02:01:02.016418648 +0000 UTC m=+5881.232079598" watchObservedRunningTime="2025-12-05 02:01:02.019989146 +0000 UTC m=+5881.235650096" Dec 05 02:01:03 crc kubenswrapper[4759]: I1205 02:01:03.163532 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:01:03 crc kubenswrapper[4759]: E1205 02:01:03.164512 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:01:04 crc kubenswrapper[4759]: I1205 02:01:04.031436 4759 generic.go:334] "Generic (PLEG): container finished" podID="f40db81b-0573-4c38-9382-5c23ef6cde76" containerID="a4481b9aa3f59229c8266d759f94529fd16808243b291500e32f41e34bee61a4" exitCode=0 Dec 05 02:01:04 crc kubenswrapper[4759]: I1205 02:01:04.031541 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415001-qdv8x" event={"ID":"f40db81b-0573-4c38-9382-5c23ef6cde76","Type":"ContainerDied","Data":"a4481b9aa3f59229c8266d759f94529fd16808243b291500e32f41e34bee61a4"} Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.494452 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.608462 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmxwj\" (UniqueName: \"kubernetes.io/projected/f40db81b-0573-4c38-9382-5c23ef6cde76-kube-api-access-dmxwj\") pod \"f40db81b-0573-4c38-9382-5c23ef6cde76\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.608565 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-combined-ca-bundle\") pod \"f40db81b-0573-4c38-9382-5c23ef6cde76\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.608650 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-config-data\") pod \"f40db81b-0573-4c38-9382-5c23ef6cde76\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.608795 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-fernet-keys\") pod \"f40db81b-0573-4c38-9382-5c23ef6cde76\" (UID: \"f40db81b-0573-4c38-9382-5c23ef6cde76\") " Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.614340 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f40db81b-0573-4c38-9382-5c23ef6cde76" (UID: "f40db81b-0573-4c38-9382-5c23ef6cde76"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.615826 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f40db81b-0573-4c38-9382-5c23ef6cde76-kube-api-access-dmxwj" (OuterVolumeSpecName: "kube-api-access-dmxwj") pod "f40db81b-0573-4c38-9382-5c23ef6cde76" (UID: "f40db81b-0573-4c38-9382-5c23ef6cde76"). InnerVolumeSpecName "kube-api-access-dmxwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.638783 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f40db81b-0573-4c38-9382-5c23ef6cde76" (UID: "f40db81b-0573-4c38-9382-5c23ef6cde76"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.684697 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-config-data" (OuterVolumeSpecName: "config-data") pod "f40db81b-0573-4c38-9382-5c23ef6cde76" (UID: "f40db81b-0573-4c38-9382-5c23ef6cde76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.712616 4759 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.712963 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmxwj\" (UniqueName: \"kubernetes.io/projected/f40db81b-0573-4c38-9382-5c23ef6cde76-kube-api-access-dmxwj\") on node \"crc\" DevicePath \"\"" Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.712987 4759 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 02:01:05 crc kubenswrapper[4759]: I1205 02:01:05.713006 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f40db81b-0573-4c38-9382-5c23ef6cde76-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 02:01:06 crc kubenswrapper[4759]: I1205 02:01:06.057236 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415001-qdv8x" event={"ID":"f40db81b-0573-4c38-9382-5c23ef6cde76","Type":"ContainerDied","Data":"c6c27badb96970d27de57dab606e3523846fd04b343f5024e4430e43d4b318ce"} Dec 05 02:01:06 crc kubenswrapper[4759]: I1205 02:01:06.057279 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6c27badb96970d27de57dab606e3523846fd04b343f5024e4430e43d4b318ce" Dec 05 02:01:06 crc kubenswrapper[4759]: I1205 02:01:06.057365 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29415001-qdv8x" Dec 05 02:01:18 crc kubenswrapper[4759]: I1205 02:01:18.156442 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:01:18 crc kubenswrapper[4759]: E1205 02:01:18.157508 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:01:33 crc kubenswrapper[4759]: I1205 02:01:33.156054 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:01:33 crc kubenswrapper[4759]: E1205 02:01:33.157269 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:01:48 crc kubenswrapper[4759]: I1205 02:01:48.156778 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:01:48 crc kubenswrapper[4759]: E1205 02:01:48.159580 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:02:01 crc kubenswrapper[4759]: I1205 02:02:01.166595 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:02:01 crc kubenswrapper[4759]: E1205 02:02:01.167544 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:02:14 crc kubenswrapper[4759]: I1205 02:02:14.156375 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:02:14 crc kubenswrapper[4759]: E1205 02:02:14.157631 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:02:25 crc kubenswrapper[4759]: I1205 02:02:25.157676 4759 scope.go:117] "RemoveContainer" 
containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:02:25 crc kubenswrapper[4759]: E1205 02:02:25.158453 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:02:36 crc kubenswrapper[4759]: I1205 02:02:36.156014 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:02:36 crc kubenswrapper[4759]: I1205 02:02:36.769797 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"78298e8e313d4f6cde19baf1fdf1ba01c2c8b4525656a5d7f4521845f2b309a5"} Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.188363 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jqrlm"] Dec 05 02:03:39 crc kubenswrapper[4759]: E1205 02:03:39.189249 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f40db81b-0573-4c38-9382-5c23ef6cde76" containerName="keystone-cron" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.189262 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f40db81b-0573-4c38-9382-5c23ef6cde76" containerName="keystone-cron" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.189513 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f40db81b-0573-4c38-9382-5c23ef6cde76" containerName="keystone-cron" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.191034 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.218294 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jqrlm"] Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.300045 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-catalog-content\") pod \"community-operators-jqrlm\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.300082 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr48w\" (UniqueName: \"kubernetes.io/projected/683ace5d-7bc0-4483-85f1-ab699da7f201-kube-api-access-sr48w\") pod \"community-operators-jqrlm\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.300425 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-utilities\") pod \"community-operators-jqrlm\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.403344 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-utilities\") pod \"community-operators-jqrlm\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.403946 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-utilities\") pod \"community-operators-jqrlm\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.404544 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-catalog-content\") pod \"community-operators-jqrlm\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.404565 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr48w\" (UniqueName: \"kubernetes.io/projected/683ace5d-7bc0-4483-85f1-ab699da7f201-kube-api-access-sr48w\") pod \"community-operators-jqrlm\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.404909 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-catalog-content\") pod \"community-operators-jqrlm\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.428791 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sr48w\" (UniqueName: \"kubernetes.io/projected/683ace5d-7bc0-4483-85f1-ab699da7f201-kube-api-access-sr48w\") pod \"community-operators-jqrlm\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:39 crc kubenswrapper[4759]: I1205 02:03:39.520800 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:40 crc kubenswrapper[4759]: I1205 02:03:40.159867 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jqrlm"] Dec 05 02:03:40 crc kubenswrapper[4759]: I1205 02:03:40.622509 4759 generic.go:334] "Generic (PLEG): container finished" podID="683ace5d-7bc0-4483-85f1-ab699da7f201" containerID="d464a3809cf46b36e578f03976e820b182c983c47dfcd237262600b150b87d07" exitCode=0 Dec 05 02:03:40 crc kubenswrapper[4759]: I1205 02:03:40.622571 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrlm" event={"ID":"683ace5d-7bc0-4483-85f1-ab699da7f201","Type":"ContainerDied","Data":"d464a3809cf46b36e578f03976e820b182c983c47dfcd237262600b150b87d07"} Dec 05 02:03:40 crc kubenswrapper[4759]: I1205 02:03:40.622838 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrlm" event={"ID":"683ace5d-7bc0-4483-85f1-ab699da7f201","Type":"ContainerStarted","Data":"0afc063244064c555a0c3198c2def1f0c5e88e026d022c9e07313fbbbe7620a9"} Dec 05 02:03:40 crc kubenswrapper[4759]: I1205 02:03:40.624899 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 02:03:41 crc kubenswrapper[4759]: I1205 02:03:41.638858 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrlm" event={"ID":"683ace5d-7bc0-4483-85f1-ab699da7f201","Type":"ContainerStarted","Data":"37a71e120ec1f18564c5b50f54529f7e43094d902d3f411e007b5b5d4bc6f46c"} Dec 05 02:03:42 crc kubenswrapper[4759]: I1205 02:03:42.651612 4759 generic.go:334] "Generic (PLEG): container finished" podID="683ace5d-7bc0-4483-85f1-ab699da7f201" containerID="37a71e120ec1f18564c5b50f54529f7e43094d902d3f411e007b5b5d4bc6f46c" exitCode=0 Dec 05 02:03:42 crc kubenswrapper[4759]: I1205 02:03:42.651656 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrlm" event={"ID":"683ace5d-7bc0-4483-85f1-ab699da7f201","Type":"ContainerDied","Data":"37a71e120ec1f18564c5b50f54529f7e43094d902d3f411e007b5b5d4bc6f46c"} Dec 05 02:03:43 crc kubenswrapper[4759]: I1205 02:03:43.664729 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrlm" event={"ID":"683ace5d-7bc0-4483-85f1-ab699da7f201","Type":"ContainerStarted","Data":"a1e34b28ebe4f9baa52dfecc07075d61e58c69ab2a33d0750b910eb32549c310"} Dec 05 02:03:43 crc kubenswrapper[4759]: I1205 02:03:43.693514 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jqrlm" podStartSLOduration=2.220672326 podStartE2EDuration="4.693494561s" podCreationTimestamp="2025-12-05 02:03:39 +0000 UTC" firstStartedPulling="2025-12-05 02:03:40.624476234 +0000 UTC m=+6039.840137224" lastFinishedPulling="2025-12-05 02:03:43.097298509 +0000 UTC m=+6042.312959459" observedRunningTime="2025-12-05 02:03:43.683849162 +0000 UTC m=+6042.899510112" watchObservedRunningTime="2025-12-05 
Dec 05 02:03:49 crc kubenswrapper[4759]: I1205 02:03:49.522658 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jqrlm"
Dec 05 02:03:49 crc kubenswrapper[4759]: I1205 02:03:49.523303 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jqrlm"
Dec 05 02:03:49 crc kubenswrapper[4759]: I1205 02:03:49.582438 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jqrlm"
Dec 05 02:03:49 crc kubenswrapper[4759]: I1205 02:03:49.790297 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jqrlm"
Dec 05 02:03:49 crc kubenswrapper[4759]: I1205 02:03:49.852753 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jqrlm"]
Dec 05 02:03:51 crc kubenswrapper[4759]: I1205 02:03:51.758046 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jqrlm" podUID="683ace5d-7bc0-4483-85f1-ab699da7f201" containerName="registry-server" containerID="cri-o://a1e34b28ebe4f9baa52dfecc07075d61e58c69ab2a33d0750b910eb32549c310" gracePeriod=2
Dec 05 02:03:52 crc kubenswrapper[4759]: I1205 02:03:52.772975 4759 generic.go:334] "Generic (PLEG): container finished" podID="683ace5d-7bc0-4483-85f1-ab699da7f201" containerID="a1e34b28ebe4f9baa52dfecc07075d61e58c69ab2a33d0750b910eb32549c310" exitCode=0
Dec 05 02:03:52 crc kubenswrapper[4759]: I1205 02:03:52.773036 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrlm" event={"ID":"683ace5d-7bc0-4483-85f1-ab699da7f201","Type":"ContainerDied","Data":"a1e34b28ebe4f9baa52dfecc07075d61e58c69ab2a33d0750b910eb32549c310"}
Dec 05 02:03:52 crc kubenswrapper[4759]: I1205 02:03:52.773380 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrlm" event={"ID":"683ace5d-7bc0-4483-85f1-ab699da7f201","Type":"ContainerDied","Data":"0afc063244064c555a0c3198c2def1f0c5e88e026d022c9e07313fbbbe7620a9"}
Dec 05 02:03:52 crc kubenswrapper[4759]: I1205 02:03:52.773406 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0afc063244064c555a0c3198c2def1f0c5e88e026d022c9e07313fbbbe7620a9"
Dec 05 02:03:52 crc kubenswrapper[4759]: I1205 02:03:52.796736 4759 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:52 crc kubenswrapper[4759]: I1205 02:03:52.920569 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr48w\" (UniqueName: \"kubernetes.io/projected/683ace5d-7bc0-4483-85f1-ab699da7f201-kube-api-access-sr48w\") pod \"683ace5d-7bc0-4483-85f1-ab699da7f201\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " Dec 05 02:03:52 crc kubenswrapper[4759]: I1205 02:03:52.920671 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-catalog-content\") pod \"683ace5d-7bc0-4483-85f1-ab699da7f201\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " Dec 05 02:03:52 crc kubenswrapper[4759]: I1205 02:03:52.920809 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-utilities\") pod \"683ace5d-7bc0-4483-85f1-ab699da7f201\" (UID: \"683ace5d-7bc0-4483-85f1-ab699da7f201\") " Dec 05 02:03:52 crc kubenswrapper[4759]: I1205 02:03:52.921704 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-utilities" (OuterVolumeSpecName: "utilities") pod "683ace5d-7bc0-4483-85f1-ab699da7f201" (UID: "683ace5d-7bc0-4483-85f1-ab699da7f201"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:03:52 crc kubenswrapper[4759]: I1205 02:03:52.927978 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/683ace5d-7bc0-4483-85f1-ab699da7f201-kube-api-access-sr48w" (OuterVolumeSpecName: "kube-api-access-sr48w") pod "683ace5d-7bc0-4483-85f1-ab699da7f201" (UID: "683ace5d-7bc0-4483-85f1-ab699da7f201"). InnerVolumeSpecName "kube-api-access-sr48w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:03:52 crc kubenswrapper[4759]: I1205 02:03:52.980900 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "683ace5d-7bc0-4483-85f1-ab699da7f201" (UID: "683ace5d-7bc0-4483-85f1-ab699da7f201"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:03:53 crc kubenswrapper[4759]: I1205 02:03:53.023525 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr48w\" (UniqueName: \"kubernetes.io/projected/683ace5d-7bc0-4483-85f1-ab699da7f201-kube-api-access-sr48w\") on node \"crc\" DevicePath \"\"" Dec 05 02:03:53 crc kubenswrapper[4759]: I1205 02:03:53.023569 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:03:53 crc kubenswrapper[4759]: I1205 02:03:53.023583 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/683ace5d-7bc0-4483-85f1-ab699da7f201-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:03:53 crc kubenswrapper[4759]: I1205 02:03:53.790082 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jqrlm" Dec 05 02:03:53 crc kubenswrapper[4759]: I1205 02:03:53.834231 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jqrlm"] Dec 05 02:03:53 crc kubenswrapper[4759]: I1205 02:03:53.852849 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jqrlm"] Dec 05 02:03:55 crc kubenswrapper[4759]: I1205 02:03:55.169085 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="683ace5d-7bc0-4483-85f1-ab699da7f201" path="/var/lib/kubelet/pods/683ace5d-7bc0-4483-85f1-ab699da7f201/volumes" Dec 05 02:05:04 crc kubenswrapper[4759]: I1205 02:05:04.433241 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:05:04 crc kubenswrapper[4759]: I1205 02:05:04.433886 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:05:34 crc kubenswrapper[4759]: I1205 02:05:34.433880 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:05:34 crc kubenswrapper[4759]: I1205 02:05:34.434564 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:06:04 crc kubenswrapper[4759]: I1205 02:06:04.433638 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:06:04 crc kubenswrapper[4759]: I1205 02:06:04.434381 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:06:04 crc kubenswrapper[4759]: I1205 02:06:04.434443 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 02:06:04 crc kubenswrapper[4759]: I1205 02:06:04.435769 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78298e8e313d4f6cde19baf1fdf1ba01c2c8b4525656a5d7f4521845f2b309a5"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Dec 05 02:06:04 crc kubenswrapper[4759]: I1205 02:06:04.435885 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://78298e8e313d4f6cde19baf1fdf1ba01c2c8b4525656a5d7f4521845f2b309a5" gracePeriod=600 Dec 05 02:06:05 crc kubenswrapper[4759]: I1205 02:06:05.408814 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="78298e8e313d4f6cde19baf1fdf1ba01c2c8b4525656a5d7f4521845f2b309a5" exitCode=0 Dec 05 02:06:05 crc kubenswrapper[4759]: I1205 02:06:05.408854 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"78298e8e313d4f6cde19baf1fdf1ba01c2c8b4525656a5d7f4521845f2b309a5"} Dec 05 02:06:05 crc kubenswrapper[4759]: I1205 02:06:05.409371 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf"} Dec 05 02:06:05 crc kubenswrapper[4759]: I1205 02:06:05.409387 4759 scope.go:117] "RemoveContainer" containerID="0eddb8dffd12a078335ada23b877c28baf955f8b6499f9310ad61a29eb02df63" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.554483 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 02:06:33 crc kubenswrapper[4759]: E1205 02:06:33.557009 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="683ace5d-7bc0-4483-85f1-ab699da7f201" containerName="extract-content" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.557138 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="683ace5d-7bc0-4483-85f1-ab699da7f201" containerName="extract-content" Dec 05 02:06:33 crc kubenswrapper[4759]: E1205 02:06:33.557221 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="683ace5d-7bc0-4483-85f1-ab699da7f201" containerName="registry-server" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.557293 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="683ace5d-7bc0-4483-85f1-ab699da7f201" containerName="registry-server" Dec 05 02:06:33 crc kubenswrapper[4759]: E1205 02:06:33.560072 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="683ace5d-7bc0-4483-85f1-ab699da7f201" containerName="extract-utilities" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.560186 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="683ace5d-7bc0-4483-85f1-ab699da7f201" containerName="extract-utilities" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.560596 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="683ace5d-7bc0-4483-85f1-ab699da7f201" containerName="registry-server" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.561531 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.566220 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.566347 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-m5tbt" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.566484 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.566985 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.572911 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.665426 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.665489 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.665527 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.665651 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-config-data\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.665722 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.665912 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.666035 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqwx9\" (UniqueName: 
\"kubernetes.io/projected/703704f3-2e29-4eed-8943-3a34a004d8fc-kube-api-access-hqwx9\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.666096 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.666213 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.768383 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-config-data\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.768567 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.768670 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.768763 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqwx9\" (UniqueName: \"kubernetes.io/projected/703704f3-2e29-4eed-8943-3a34a004d8fc-kube-api-access-hqwx9\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.768806 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.768941 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.769181 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-workdir\") pod 
\"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.769248 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.769333 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.769608 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.769984 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.771741 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-config-data\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.769976 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.773579 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.776623 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.776741 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc 
kubenswrapper[4759]: I1205 02:06:33.779769 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.800704 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqwx9\" (UniqueName: \"kubernetes.io/projected/703704f3-2e29-4eed-8943-3a34a004d8fc-kube-api-access-hqwx9\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.835163 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " pod="openstack/tempest-tests-tempest" Dec 05 02:06:33 crc kubenswrapper[4759]: I1205 02:06:33.882987 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 02:06:34 crc kubenswrapper[4759]: I1205 02:06:34.428743 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 02:06:34 crc kubenswrapper[4759]: I1205 02:06:34.771948 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"703704f3-2e29-4eed-8943-3a34a004d8fc","Type":"ContainerStarted","Data":"74d240ae37d1d13fe81f8ad63f62dd76aedd4e64c1aab300c00e38fc2954dcc0"} Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.143658 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qvx9n"] Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.147430 4759 util.go:30] "No sandbox for pod can be found. 
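Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvx9n"

The tempest-tests-tempest entries above trace the kubelet volume reconciler's per-volume sequence: VerifyControllerAttachedVolume, then MountVolume.MountDevice for device-backed plugins only (here the local-volume local-storage03-crc, mounted at /mnt/openstack/pv03), then MountVolume.SetUp for every volume. An illustrative sketch of that ordering — not the kubelet's actual reconciler code, which lives in its volumemanager package:

```go
package main

import "fmt"

// volume models just enough to show the ordering: only device-backed
// volumes (e.g. kubernetes.io/local-volume) get a MountDevice step.
type volume struct {
	name      string
	hasDevice bool
}

func reconcile(v volume) {
	fmt.Printf("VerifyControllerAttachedVolume started for volume %q\n", v.name)
	if v.hasDevice {
		fmt.Printf("MountVolume.MountDevice succeeded for volume %q\n", v.name)
	}
	fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.name)
}

func main() {
	// Volume names taken from the tempest-tests-tempest entries above.
	for _, v := range []volume{
		{name: "config-data"},
		{name: "local-storage03-crc", hasDevice: true},
		{name: "kube-api-access-hqwx9"},
	} {
		reconcile(v)
	}
}
```

The same three-stage pattern repeats for the redhat-marketplace-qvx9n pod in the entries that follow.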
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.184549 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvlgh\" (UniqueName: \"kubernetes.io/projected/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-kube-api-access-cvlgh\") pod \"redhat-marketplace-qvx9n\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.185133 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-utilities\") pod \"redhat-marketplace-qvx9n\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.185189 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-catalog-content\") pod \"redhat-marketplace-qvx9n\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.238719 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvx9n"] Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.289986 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-catalog-content\") pod \"redhat-marketplace-qvx9n\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.290199 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlgh\" (UniqueName: \"kubernetes.io/projected/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-kube-api-access-cvlgh\") pod \"redhat-marketplace-qvx9n\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.290296 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-utilities\") pod \"redhat-marketplace-qvx9n\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.290872 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-utilities\") pod \"redhat-marketplace-qvx9n\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.291137 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-catalog-content\") pod \"redhat-marketplace-qvx9n\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.327029 4759 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-cvlgh\" (UniqueName: \"kubernetes.io/projected/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-kube-api-access-cvlgh\") pod \"redhat-marketplace-qvx9n\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:06:57 crc kubenswrapper[4759]: I1205 02:06:57.480491 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:07:09 crc kubenswrapper[4759]: E1205 02:07:09.248213 4759 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 05 02:07:09 crc kubenswrapper[4759]: E1205 02:07:09.249801 4759 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hqwx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,
},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(703704f3-2e29-4eed-8943-3a34a004d8fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 02:07:09 crc kubenswrapper[4759]: E1205 02:07:09.251033 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="703704f3-2e29-4eed-8943-3a34a004d8fc" Dec 05 02:07:09 crc kubenswrapper[4759]: I1205 02:07:09.637457 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvx9n"] Dec 05 02:07:10 crc kubenswrapper[4759]: I1205 02:07:10.213263 4759 generic.go:334] "Generic (PLEG): container finished" podID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" containerID="98ac08548b29e3e02508dee5921d4829435b57254887a6731e03fbd2c2f50698" exitCode=0 Dec 05 02:07:10 crc kubenswrapper[4759]: I1205 02:07:10.213391 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvx9n" event={"ID":"f20aabbd-fa71-4b89-9253-c90e9fe4bbef","Type":"ContainerDied","Data":"98ac08548b29e3e02508dee5921d4829435b57254887a6731e03fbd2c2f50698"} Dec 05 02:07:10 crc kubenswrapper[4759]: I1205 02:07:10.214613 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvx9n" event={"ID":"f20aabbd-fa71-4b89-9253-c90e9fe4bbef","Type":"ContainerStarted","Data":"f0e39042b414ca6b67d7693f6a46573f28818a6a8f277a04c89041fd0fa4e54b"} Dec 05 02:07:10 crc kubenswrapper[4759]: E1205 02:07:10.215505 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="703704f3-2e29-4eed-8943-3a34a004d8fc" Dec 05 02:07:11 crc kubenswrapper[4759]: I1205 02:07:11.230923 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvx9n" event={"ID":"f20aabbd-fa71-4b89-9253-c90e9fe4bbef","Type":"ContainerStarted","Data":"0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b"} Dec 05 02:07:12 crc kubenswrapper[4759]: I1205 02:07:12.250201 4759 generic.go:334] "Generic (PLEG): container finished" podID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" containerID="0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b" exitCode=0 Dec 05 02:07:12 crc kubenswrapper[4759]: I1205 02:07:12.250429 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvx9n" event={"ID":"f20aabbd-fa71-4b89-9253-c90e9fe4bbef","Type":"ContainerDied","Data":"0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b"} Dec 05 02:07:13 crc kubenswrapper[4759]: I1205 02:07:13.265771 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvx9n" 
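event={"ID":"f20aabbd-fa71-4b89-9253-c90e9fe4bbef","Type":"ContainerStarted","Data":"3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5"}

The 02:07:09-02:07:10 entries above show the two stages of a failed pull for the tempest image: the attempt itself fails with ErrImagePull (the canceled CRI copy), after which subsequent syncs are skipped with ImagePullBackOff until the back-off window passes; the pull eventually succeeds and the pod starts at 02:07:24 below. A rough sketch of that cycle — pullImage is a stand-in for the CRI call, and the 10s base back-off is an assumption:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// pullImage stands in for the CRI image pull that failed above; the
// error text is copied from the log.
func pullImage(image string) error {
	return errors.New("rpc error: code = Canceled desc = copying config: context canceled")
}

func main() {
	image := "quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
	backoff := 10 * time.Second // assumed base; doubles per failed attempt
	for attempt := 1; attempt <= 3; attempt++ {
		if err := pullImage(image); err != nil {
			// First the sync fails with ErrImagePull...
			fmt.Printf("attempt %d ErrImagePull: %v\n", attempt, err)
			// ...then later syncs are skipped with ImagePullBackOff until the window expires.
			fmt.Printf("ImagePullBackOff: back-off %v pulling image %q\n", backoff, image)
			backoff *= 2
		}
	}
}
```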
event={"ID":"f20aabbd-fa71-4b89-9253-c90e9fe4bbef","Type":"ContainerStarted","Data":"3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5"} Dec 05 02:07:13 crc kubenswrapper[4759]: I1205 02:07:13.296479 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qvx9n" podStartSLOduration=13.792752162 podStartE2EDuration="16.29645855s" podCreationTimestamp="2025-12-05 02:06:57 +0000 UTC" firstStartedPulling="2025-12-05 02:07:10.215758379 +0000 UTC m=+6249.431419379" lastFinishedPulling="2025-12-05 02:07:12.719464787 +0000 UTC m=+6251.935125767" observedRunningTime="2025-12-05 02:07:13.284749786 +0000 UTC m=+6252.500410746" watchObservedRunningTime="2025-12-05 02:07:13.29645855 +0000 UTC m=+6252.512119500" Dec 05 02:07:17 crc kubenswrapper[4759]: I1205 02:07:17.481848 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:07:17 crc kubenswrapper[4759]: I1205 02:07:17.482480 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:07:17 crc kubenswrapper[4759]: I1205 02:07:17.541252 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:07:18 crc kubenswrapper[4759]: I1205 02:07:18.483891 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:07:18 crc kubenswrapper[4759]: I1205 02:07:18.543901 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvx9n"] Dec 05 02:07:20 crc kubenswrapper[4759]: I1205 02:07:20.404120 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qvx9n" podUID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" containerName="registry-server" containerID="cri-o://3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5" gracePeriod=2 Dec 05 02:07:20 crc kubenswrapper[4759]: I1205 02:07:20.948407 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.057110 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-utilities\") pod \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.057456 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-catalog-content\") pod \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.057562 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvlgh\" (UniqueName: \"kubernetes.io/projected/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-kube-api-access-cvlgh\") pod \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\" (UID: \"f20aabbd-fa71-4b89-9253-c90e9fe4bbef\") " Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.058522 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-utilities" (OuterVolumeSpecName: "utilities") pod "f20aabbd-fa71-4b89-9253-c90e9fe4bbef" (UID: "f20aabbd-fa71-4b89-9253-c90e9fe4bbef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.063697 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-kube-api-access-cvlgh" (OuterVolumeSpecName: "kube-api-access-cvlgh") pod "f20aabbd-fa71-4b89-9253-c90e9fe4bbef" (UID: "f20aabbd-fa71-4b89-9253-c90e9fe4bbef"). InnerVolumeSpecName "kube-api-access-cvlgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.082248 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f20aabbd-fa71-4b89-9253-c90e9fe4bbef" (UID: "f20aabbd-fa71-4b89-9253-c90e9fe4bbef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.159976 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.160286 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvlgh\" (UniqueName: \"kubernetes.io/projected/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-kube-api-access-cvlgh\") on node \"crc\" DevicePath \"\"" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.160369 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20aabbd-fa71-4b89-9253-c90e9fe4bbef-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.420868 4759 generic.go:334] "Generic (PLEG): container finished" podID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" containerID="3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5" exitCode=0 Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.420949 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvx9n" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.420953 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvx9n" event={"ID":"f20aabbd-fa71-4b89-9253-c90e9fe4bbef","Type":"ContainerDied","Data":"3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5"} Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.421168 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvx9n" event={"ID":"f20aabbd-fa71-4b89-9253-c90e9fe4bbef","Type":"ContainerDied","Data":"f0e39042b414ca6b67d7693f6a46573f28818a6a8f277a04c89041fd0fa4e54b"} Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.421212 4759 scope.go:117] "RemoveContainer" containerID="3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.462021 4759 scope.go:117] "RemoveContainer" containerID="0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.464849 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvx9n"] Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.474600 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvx9n"] Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.504950 4759 scope.go:117] "RemoveContainer" containerID="98ac08548b29e3e02508dee5921d4829435b57254887a6731e03fbd2c2f50698" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.568530 4759 scope.go:117] "RemoveContainer" containerID="3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5" Dec 05 02:07:21 crc kubenswrapper[4759]: E1205 02:07:21.569087 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5\": container with ID starting with 3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5 not found: ID does not exist" containerID="3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.569137 4759 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5"} err="failed to get container status \"3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5\": rpc error: code = NotFound desc = could not find container \"3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5\": container with ID starting with 3a51212a7747c4186aa511a929c173351f5f368112b82b815505851962288db5 not found: ID does not exist" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.569170 4759 scope.go:117] "RemoveContainer" containerID="0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b" Dec 05 02:07:21 crc kubenswrapper[4759]: E1205 02:07:21.569840 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b\": container with ID starting with 0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b not found: ID does not exist" containerID="0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.569883 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b"} err="failed to get container status \"0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b\": rpc error: code = NotFound desc = could not find container \"0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b\": container with ID starting with 0cde5c47d86c21a84152789721a1a7cf508cd9b0ca5901d54620658e4d77939b not found: ID does not exist" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.569916 4759 scope.go:117] "RemoveContainer" containerID="98ac08548b29e3e02508dee5921d4829435b57254887a6731e03fbd2c2f50698" Dec 05 02:07:21 crc kubenswrapper[4759]: E1205 02:07:21.570432 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ac08548b29e3e02508dee5921d4829435b57254887a6731e03fbd2c2f50698\": container with ID starting with 98ac08548b29e3e02508dee5921d4829435b57254887a6731e03fbd2c2f50698 not found: ID does not exist" containerID="98ac08548b29e3e02508dee5921d4829435b57254887a6731e03fbd2c2f50698" Dec 05 02:07:21 crc kubenswrapper[4759]: I1205 02:07:21.570467 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ac08548b29e3e02508dee5921d4829435b57254887a6731e03fbd2c2f50698"} err="failed to get container status \"98ac08548b29e3e02508dee5921d4829435b57254887a6731e03fbd2c2f50698\": rpc error: code = NotFound desc = could not find container \"98ac08548b29e3e02508dee5921d4829435b57254887a6731e03fbd2c2f50698\": container with ID starting with 98ac08548b29e3e02508dee5921d4829435b57254887a6731e03fbd2c2f50698 not found: ID does not exist" Dec 05 02:07:22 crc kubenswrapper[4759]: I1205 02:07:22.602478 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 05 02:07:23 crc kubenswrapper[4759]: I1205 02:07:23.173707 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" path="/var/lib/kubelet/pods/f20aabbd-fa71-4b89-9253-c90e9fe4bbef/volumes" Dec 05 02:07:24 crc kubenswrapper[4759]: I1205 02:07:24.459817 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"703704f3-2e29-4eed-8943-3a34a004d8fc","Type":"ContainerStarted","Data":"3b21db198fa4b81f66dfa74870cce7ade64754b740837c0178dd2c4b0fdd390b"} Dec 05 02:07:24 crc kubenswrapper[4759]: I1205 02:07:24.506704 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.342984674 podStartE2EDuration="52.506671726s" podCreationTimestamp="2025-12-05 02:06:32 +0000 UTC" firstStartedPulling="2025-12-05 02:06:34.433160701 +0000 UTC m=+6213.648821641" lastFinishedPulling="2025-12-05 02:07:22.596847723 +0000 UTC m=+6261.812508693" observedRunningTime="2025-12-05 02:07:24.479849293 +0000 UTC m=+6263.695510243" watchObservedRunningTime="2025-12-05 02:07:24.506671726 +0000 UTC m=+6263.722332706" Dec 05 02:08:04 crc kubenswrapper[4759]: I1205 02:08:04.433263 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:08:04 crc kubenswrapper[4759]: I1205 02:08:04.433845 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:08:34 crc kubenswrapper[4759]: I1205 02:08:34.433131 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:08:34 crc kubenswrapper[4759]: I1205 02:08:34.433687 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:09:04 crc kubenswrapper[4759]: I1205 02:09:04.433096 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:09:04 crc kubenswrapper[4759]: I1205 02:09:04.434780 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:09:04 crc kubenswrapper[4759]: I1205 02:09:04.435247 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 02:09:04 crc kubenswrapper[4759]: I1205 02:09:04.436850 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf"} 
pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 02:09:04 crc kubenswrapper[4759]: I1205 02:09:04.437098 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" gracePeriod=600 Dec 05 02:09:04 crc kubenswrapper[4759]: E1205 02:09:04.572259 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:09:04 crc kubenswrapper[4759]: I1205 02:09:04.699913 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf"} Dec 05 02:09:04 crc kubenswrapper[4759]: I1205 02:09:04.699940 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" exitCode=0 Dec 05 02:09:04 crc kubenswrapper[4759]: I1205 02:09:04.700560 4759 scope.go:117] "RemoveContainer" containerID="78298e8e313d4f6cde19baf1fdf1ba01c2c8b4525656a5d7f4521845f2b309a5" Dec 05 02:09:04 crc kubenswrapper[4759]: I1205 02:09:04.700910 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:09:04 crc kubenswrapper[4759]: E1205 02:09:04.701733 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:09:19 crc kubenswrapper[4759]: I1205 02:09:19.156482 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:09:19 crc kubenswrapper[4759]: E1205 02:09:19.157878 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:09:32 crc kubenswrapper[4759]: I1205 02:09:32.156552 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:09:32 crc kubenswrapper[4759]: E1205 02:09:32.157387 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:09:43 crc kubenswrapper[4759]: I1205 02:09:43.157050 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:09:43 crc kubenswrapper[4759]: E1205 02:09:43.157940 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:09:55 crc kubenswrapper[4759]: I1205 02:09:55.156234 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:09:55 crc kubenswrapper[4759]: E1205 02:09:55.156944 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.775722 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zvs4d"] Dec 05 02:10:06 crc kubenswrapper[4759]: E1205 02:10:06.781287 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" containerName="registry-server" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.781458 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" containerName="registry-server" Dec 05 02:10:06 crc kubenswrapper[4759]: E1205 02:10:06.781509 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" containerName="extract-content" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.781519 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" containerName="extract-content" Dec 05 02:10:06 crc kubenswrapper[4759]: E1205 02:10:06.781556 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" containerName="extract-utilities" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.781565 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" containerName="extract-utilities" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.782220 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f20aabbd-fa71-4b89-9253-c90e9fe4bbef" containerName="registry-server" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.784801 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.796986 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-catalog-content\") pod \"redhat-operators-zvs4d\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.797174 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-utilities\") pod \"redhat-operators-zvs4d\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.797590 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpsfb\" (UniqueName: \"kubernetes.io/projected/ac510014-5593-46f1-ae38-519d91cf9178-kube-api-access-mpsfb\") pod \"redhat-operators-zvs4d\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.811450 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvs4d"] Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.898560 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpsfb\" (UniqueName: \"kubernetes.io/projected/ac510014-5593-46f1-ae38-519d91cf9178-kube-api-access-mpsfb\") pod \"redhat-operators-zvs4d\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.898647 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-catalog-content\") pod \"redhat-operators-zvs4d\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.898682 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-utilities\") pod \"redhat-operators-zvs4d\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.900802 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-utilities\") pod \"redhat-operators-zvs4d\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.901817 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-catalog-content\") pod \"redhat-operators-zvs4d\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:06 crc kubenswrapper[4759]: I1205 02:10:06.930033 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mpsfb\" (UniqueName: \"kubernetes.io/projected/ac510014-5593-46f1-ae38-519d91cf9178-kube-api-access-mpsfb\") pod \"redhat-operators-zvs4d\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:07 crc kubenswrapper[4759]: I1205 02:10:07.108918 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:07 crc kubenswrapper[4759]: I1205 02:10:07.746825 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvs4d"] Dec 05 02:10:08 crc kubenswrapper[4759]: I1205 02:10:08.441395 4759 generic.go:334] "Generic (PLEG): container finished" podID="ac510014-5593-46f1-ae38-519d91cf9178" containerID="c9f7580c951a22f3c203260eb77e3f84aa63a3b5d51442672abceac32608016c" exitCode=0 Dec 05 02:10:08 crc kubenswrapper[4759]: I1205 02:10:08.441622 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvs4d" event={"ID":"ac510014-5593-46f1-ae38-519d91cf9178","Type":"ContainerDied","Data":"c9f7580c951a22f3c203260eb77e3f84aa63a3b5d51442672abceac32608016c"} Dec 05 02:10:08 crc kubenswrapper[4759]: I1205 02:10:08.442064 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvs4d" event={"ID":"ac510014-5593-46f1-ae38-519d91cf9178","Type":"ContainerStarted","Data":"5303f33c556256b425684e6ca0154ec335bf1a76f96d2fa74520b2de0be2c319"} Dec 05 02:10:08 crc kubenswrapper[4759]: I1205 02:10:08.448477 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 02:10:09 crc kubenswrapper[4759]: I1205 02:10:09.156657 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:10:09 crc kubenswrapper[4759]: E1205 02:10:09.157555 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:10:09 crc kubenswrapper[4759]: I1205 02:10:09.459870 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvs4d" event={"ID":"ac510014-5593-46f1-ae38-519d91cf9178","Type":"ContainerStarted","Data":"76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5"} Dec 05 02:10:12 crc kubenswrapper[4759]: I1205 02:10:12.496330 4759 generic.go:334] "Generic (PLEG): container finished" podID="ac510014-5593-46f1-ae38-519d91cf9178" containerID="76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5" exitCode=0 Dec 05 02:10:12 crc kubenswrapper[4759]: I1205 02:10:12.496437 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvs4d" event={"ID":"ac510014-5593-46f1-ae38-519d91cf9178","Type":"ContainerDied","Data":"76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5"} Dec 05 02:10:13 crc kubenswrapper[4759]: I1205 02:10:13.523804 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvs4d" 
event={"ID":"ac510014-5593-46f1-ae38-519d91cf9178","Type":"ContainerStarted","Data":"05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10"} Dec 05 02:10:13 crc kubenswrapper[4759]: I1205 02:10:13.563572 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zvs4d" podStartSLOduration=3.126156901 podStartE2EDuration="7.561482763s" podCreationTimestamp="2025-12-05 02:10:06 +0000 UTC" firstStartedPulling="2025-12-05 02:10:08.446767675 +0000 UTC m=+6427.662428665" lastFinishedPulling="2025-12-05 02:10:12.882093577 +0000 UTC m=+6432.097754527" observedRunningTime="2025-12-05 02:10:13.549392948 +0000 UTC m=+6432.765053918" watchObservedRunningTime="2025-12-05 02:10:13.561482763 +0000 UTC m=+6432.777143713" Dec 05 02:10:17 crc kubenswrapper[4759]: I1205 02:10:17.110915 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:17 crc kubenswrapper[4759]: I1205 02:10:17.111541 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:18 crc kubenswrapper[4759]: I1205 02:10:18.156843 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zvs4d" podUID="ac510014-5593-46f1-ae38-519d91cf9178" containerName="registry-server" probeResult="failure" output=< Dec 05 02:10:18 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 02:10:18 crc kubenswrapper[4759]: > Dec 05 02:10:20 crc kubenswrapper[4759]: I1205 02:10:20.156376 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:10:20 crc kubenswrapper[4759]: E1205 02:10:20.157011 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:10:28 crc kubenswrapper[4759]: I1205 02:10:28.158392 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zvs4d" podUID="ac510014-5593-46f1-ae38-519d91cf9178" containerName="registry-server" probeResult="failure" output=< Dec 05 02:10:28 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 02:10:28 crc kubenswrapper[4759]: > Dec 05 02:10:31 crc kubenswrapper[4759]: I1205 02:10:31.165351 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:10:31 crc kubenswrapper[4759]: E1205 02:10:31.166014 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:10:37 crc kubenswrapper[4759]: I1205 02:10:36.999525 4759 scope.go:117] "RemoveContainer" containerID="a1e34b28ebe4f9baa52dfecc07075d61e58c69ab2a33d0750b910eb32549c310" Dec 05 02:10:37 crc kubenswrapper[4759]: 
I1205 02:10:37.061006 4759 scope.go:117] "RemoveContainer" containerID="37a71e120ec1f18564c5b50f54529f7e43094d902d3f411e007b5b5d4bc6f46c" Dec 05 02:10:37 crc kubenswrapper[4759]: I1205 02:10:37.107648 4759 scope.go:117] "RemoveContainer" containerID="d464a3809cf46b36e578f03976e820b182c983c47dfcd237262600b150b87d07" Dec 05 02:10:37 crc kubenswrapper[4759]: I1205 02:10:37.231323 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:37 crc kubenswrapper[4759]: I1205 02:10:37.288779 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:38 crc kubenswrapper[4759]: I1205 02:10:38.025938 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvs4d"] Dec 05 02:10:38 crc kubenswrapper[4759]: I1205 02:10:38.790072 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zvs4d" podUID="ac510014-5593-46f1-ae38-519d91cf9178" containerName="registry-server" containerID="cri-o://05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10" gracePeriod=2 Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.783743 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.803156 4759 generic.go:334] "Generic (PLEG): container finished" podID="ac510014-5593-46f1-ae38-519d91cf9178" containerID="05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10" exitCode=0 Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.803197 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvs4d" event={"ID":"ac510014-5593-46f1-ae38-519d91cf9178","Type":"ContainerDied","Data":"05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10"} Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.803232 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvs4d" event={"ID":"ac510014-5593-46f1-ae38-519d91cf9178","Type":"ContainerDied","Data":"5303f33c556256b425684e6ca0154ec335bf1a76f96d2fa74520b2de0be2c319"} Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.803253 4759 scope.go:117] "RemoveContainer" containerID="05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10" Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.803339 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvs4d" Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.859638 4759 scope.go:117] "RemoveContainer" containerID="76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5" Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.893030 4759 scope.go:117] "RemoveContainer" containerID="c9f7580c951a22f3c203260eb77e3f84aa63a3b5d51442672abceac32608016c" Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.898294 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpsfb\" (UniqueName: \"kubernetes.io/projected/ac510014-5593-46f1-ae38-519d91cf9178-kube-api-access-mpsfb\") pod \"ac510014-5593-46f1-ae38-519d91cf9178\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.898528 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-catalog-content\") pod \"ac510014-5593-46f1-ae38-519d91cf9178\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.898733 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-utilities\") pod \"ac510014-5593-46f1-ae38-519d91cf9178\" (UID: \"ac510014-5593-46f1-ae38-519d91cf9178\") " Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.901445 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-utilities" (OuterVolumeSpecName: "utilities") pod "ac510014-5593-46f1-ae38-519d91cf9178" (UID: "ac510014-5593-46f1-ae38-519d91cf9178"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:10:39 crc kubenswrapper[4759]: I1205 02:10:39.916549 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac510014-5593-46f1-ae38-519d91cf9178-kube-api-access-mpsfb" (OuterVolumeSpecName: "kube-api-access-mpsfb") pod "ac510014-5593-46f1-ae38-519d91cf9178" (UID: "ac510014-5593-46f1-ae38-519d91cf9178"). InnerVolumeSpecName "kube-api-access-mpsfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.006839 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.006874 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpsfb\" (UniqueName: \"kubernetes.io/projected/ac510014-5593-46f1-ae38-519d91cf9178-kube-api-access-mpsfb\") on node \"crc\" DevicePath \"\"" Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.009068 4759 scope.go:117] "RemoveContainer" containerID="05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10" Dec 05 02:10:40 crc kubenswrapper[4759]: E1205 02:10:40.010578 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10\": container with ID starting with 05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10 not found: ID does not exist" containerID="05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10" Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.010825 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10"} err="failed to get container status \"05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10\": rpc error: code = NotFound desc = could not find container \"05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10\": container with ID starting with 05c2389d6f275da81b13b47dd0b5118682571549d7b4300bea3c06e1aab3cf10 not found: ID does not exist" Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.010861 4759 scope.go:117] "RemoveContainer" containerID="76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5" Dec 05 02:10:40 crc kubenswrapper[4759]: E1205 02:10:40.016718 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5\": container with ID starting with 76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5 not found: ID does not exist" containerID="76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5" Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.016759 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5"} err="failed to get container status \"76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5\": rpc error: code = NotFound desc = could not find container \"76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5\": container with ID starting with 76cbf50b52c7300bf28fddcd3158748663e3bc699970d4b0f076827d5c0f47f5 not found: ID does not exist" Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.016780 4759 scope.go:117] "RemoveContainer" containerID="c9f7580c951a22f3c203260eb77e3f84aa63a3b5d51442672abceac32608016c" Dec 05 02:10:40 crc kubenswrapper[4759]: E1205 02:10:40.022412 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f7580c951a22f3c203260eb77e3f84aa63a3b5d51442672abceac32608016c\": container with ID starting with 
c9f7580c951a22f3c203260eb77e3f84aa63a3b5d51442672abceac32608016c not found: ID does not exist" containerID="c9f7580c951a22f3c203260eb77e3f84aa63a3b5d51442672abceac32608016c" Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.022455 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f7580c951a22f3c203260eb77e3f84aa63a3b5d51442672abceac32608016c"} err="failed to get container status \"c9f7580c951a22f3c203260eb77e3f84aa63a3b5d51442672abceac32608016c\": rpc error: code = NotFound desc = could not find container \"c9f7580c951a22f3c203260eb77e3f84aa63a3b5d51442672abceac32608016c\": container with ID starting with c9f7580c951a22f3c203260eb77e3f84aa63a3b5d51442672abceac32608016c not found: ID does not exist" Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.049585 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac510014-5593-46f1-ae38-519d91cf9178" (UID: "ac510014-5593-46f1-ae38-519d91cf9178"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.108491 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac510014-5593-46f1-ae38-519d91cf9178-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.146950 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvs4d"] Dec 05 02:10:40 crc kubenswrapper[4759]: I1205 02:10:40.158867 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zvs4d"] Dec 05 02:10:40 crc kubenswrapper[4759]: E1205 02:10:40.241042 4759 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac510014_5593_46f1_ae38_519d91cf9178.slice/crio-5303f33c556256b425684e6ca0154ec335bf1a76f96d2fa74520b2de0be2c319\": RecentStats: unable to find data in memory cache]" Dec 05 02:10:41 crc kubenswrapper[4759]: I1205 02:10:41.169982 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac510014-5593-46f1-ae38-519d91cf9178" path="/var/lib/kubelet/pods/ac510014-5593-46f1-ae38-519d91cf9178/volumes" Dec 05 02:10:43 crc kubenswrapper[4759]: I1205 02:10:43.159061 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:10:43 crc kubenswrapper[4759]: E1205 02:10:43.160420 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:10:55 crc kubenswrapper[4759]: I1205 02:10:55.164760 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:10:55 crc kubenswrapper[4759]: E1205 02:10:55.167813 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:11:08 crc kubenswrapper[4759]: I1205 02:11:08.156245 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:11:08 crc kubenswrapper[4759]: E1205 02:11:08.157139 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:11:19 crc kubenswrapper[4759]: I1205 02:11:19.156074 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:11:19 crc kubenswrapper[4759]: E1205 02:11:19.157236 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:11:33 crc kubenswrapper[4759]: I1205 02:11:33.155402 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:11:33 crc kubenswrapper[4759]: E1205 02:11:33.156008 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.156964 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:11:44 crc kubenswrapper[4759]: E1205 02:11:44.158081 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.385153 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d58f9"] Dec 05 02:11:44 crc kubenswrapper[4759]: E1205 02:11:44.386645 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac510014-5593-46f1-ae38-519d91cf9178" containerName="extract-content" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.386678 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac510014-5593-46f1-ae38-519d91cf9178" containerName="extract-content" Dec 05 02:11:44 crc 
kubenswrapper[4759]: E1205 02:11:44.386705 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac510014-5593-46f1-ae38-519d91cf9178" containerName="registry-server" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.386714 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac510014-5593-46f1-ae38-519d91cf9178" containerName="registry-server" Dec 05 02:11:44 crc kubenswrapper[4759]: E1205 02:11:44.386762 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac510014-5593-46f1-ae38-519d91cf9178" containerName="extract-utilities" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.386772 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac510014-5593-46f1-ae38-519d91cf9178" containerName="extract-utilities" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.387283 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac510014-5593-46f1-ae38-519d91cf9178" containerName="registry-server" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.390087 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.406061 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d58f9"] Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.487001 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-catalog-content\") pod \"certified-operators-d58f9\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.487223 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6tw6\" (UniqueName: \"kubernetes.io/projected/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-kube-api-access-z6tw6\") pod \"certified-operators-d58f9\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.487282 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-utilities\") pod \"certified-operators-d58f9\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.589854 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6tw6\" (UniqueName: \"kubernetes.io/projected/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-kube-api-access-z6tw6\") pod \"certified-operators-d58f9\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.589940 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-utilities\") pod \"certified-operators-d58f9\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.590082 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-catalog-content\") pod \"certified-operators-d58f9\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.590688 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-utilities\") pod \"certified-operators-d58f9\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.590791 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-catalog-content\") pod \"certified-operators-d58f9\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.614281 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6tw6\" (UniqueName: \"kubernetes.io/projected/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-kube-api-access-z6tw6\") pod \"certified-operators-d58f9\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:44 crc kubenswrapper[4759]: I1205 02:11:44.729256 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:45 crc kubenswrapper[4759]: I1205 02:11:45.280794 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d58f9"] Dec 05 02:11:45 crc kubenswrapper[4759]: I1205 02:11:45.734167 4759 generic.go:334] "Generic (PLEG): container finished" podID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" containerID="aee22a4910c8cc0c0cd7d8b6cb428e280494390fad68f4721c4b8381f1e599a0" exitCode=0 Dec 05 02:11:45 crc kubenswrapper[4759]: I1205 02:11:45.734353 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d58f9" event={"ID":"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8","Type":"ContainerDied","Data":"aee22a4910c8cc0c0cd7d8b6cb428e280494390fad68f4721c4b8381f1e599a0"} Dec 05 02:11:45 crc kubenswrapper[4759]: I1205 02:11:45.734550 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d58f9" event={"ID":"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8","Type":"ContainerStarted","Data":"2e1da804fb3f3518937a1cc256108007bfd5a8de5785a7e0594aa7bfec184b66"} Dec 05 02:11:46 crc kubenswrapper[4759]: I1205 02:11:46.747093 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d58f9" event={"ID":"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8","Type":"ContainerStarted","Data":"ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423"} Dec 05 02:11:47 crc kubenswrapper[4759]: I1205 02:11:47.758379 4759 generic.go:334] "Generic (PLEG): container finished" podID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" containerID="ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423" exitCode=0 Dec 05 02:11:47 crc kubenswrapper[4759]: I1205 02:11:47.758474 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d58f9" 
event={"ID":"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8","Type":"ContainerDied","Data":"ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423"} Dec 05 02:11:48 crc kubenswrapper[4759]: I1205 02:11:48.770970 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d58f9" event={"ID":"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8","Type":"ContainerStarted","Data":"be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8"} Dec 05 02:11:54 crc kubenswrapper[4759]: I1205 02:11:54.730210 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:54 crc kubenswrapper[4759]: I1205 02:11:54.733590 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:54 crc kubenswrapper[4759]: I1205 02:11:54.781565 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:54 crc kubenswrapper[4759]: I1205 02:11:54.808434 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d58f9" podStartSLOduration=8.398028704 podStartE2EDuration="10.80794282s" podCreationTimestamp="2025-12-05 02:11:44 +0000 UTC" firstStartedPulling="2025-12-05 02:11:45.736424608 +0000 UTC m=+6524.952085558" lastFinishedPulling="2025-12-05 02:11:48.146338704 +0000 UTC m=+6527.361999674" observedRunningTime="2025-12-05 02:11:48.811073682 +0000 UTC m=+6528.026734632" watchObservedRunningTime="2025-12-05 02:11:54.80794282 +0000 UTC m=+6534.023603780" Dec 05 02:11:54 crc kubenswrapper[4759]: I1205 02:11:54.884502 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:55 crc kubenswrapper[4759]: I1205 02:11:55.023287 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d58f9"] Dec 05 02:11:55 crc kubenswrapper[4759]: I1205 02:11:55.155916 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:11:55 crc kubenswrapper[4759]: E1205 02:11:55.156266 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:11:56 crc kubenswrapper[4759]: I1205 02:11:56.853803 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d58f9" podUID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" containerName="registry-server" containerID="cri-o://be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8" gracePeriod=2 Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.616100 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.701794 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6tw6\" (UniqueName: \"kubernetes.io/projected/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-kube-api-access-z6tw6\") pod \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.701872 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-catalog-content\") pod \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.701940 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-utilities\") pod \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\" (UID: \"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8\") " Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.702859 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-utilities" (OuterVolumeSpecName: "utilities") pod "2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" (UID: "2e9b0291-a6e8-4e18-a915-a7a3db4a14d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.707885 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-kube-api-access-z6tw6" (OuterVolumeSpecName: "kube-api-access-z6tw6") pod "2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" (UID: "2e9b0291-a6e8-4e18-a915-a7a3db4a14d8"). InnerVolumeSpecName "kube-api-access-z6tw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.752231 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" (UID: "2e9b0291-a6e8-4e18-a915-a7a3db4a14d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.804989 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6tw6\" (UniqueName: \"kubernetes.io/projected/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-kube-api-access-z6tw6\") on node \"crc\" DevicePath \"\"" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.805265 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.805342 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.865374 4759 generic.go:334] "Generic (PLEG): container finished" podID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" containerID="be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8" exitCode=0 Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.865445 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d58f9" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.865449 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d58f9" event={"ID":"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8","Type":"ContainerDied","Data":"be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8"} Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.866564 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d58f9" event={"ID":"2e9b0291-a6e8-4e18-a915-a7a3db4a14d8","Type":"ContainerDied","Data":"2e1da804fb3f3518937a1cc256108007bfd5a8de5785a7e0594aa7bfec184b66"} Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.866591 4759 scope.go:117] "RemoveContainer" containerID="be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.897224 4759 scope.go:117] "RemoveContainer" containerID="ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.915630 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d58f9"] Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.928573 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d58f9"] Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.939701 4759 scope.go:117] "RemoveContainer" containerID="aee22a4910c8cc0c0cd7d8b6cb428e280494390fad68f4721c4b8381f1e599a0" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.982257 4759 scope.go:117] "RemoveContainer" containerID="be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8" Dec 05 02:11:57 crc kubenswrapper[4759]: E1205 02:11:57.982816 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8\": container with ID starting with be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8 not found: ID does not exist" containerID="be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.982922 
4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8"} err="failed to get container status \"be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8\": rpc error: code = NotFound desc = could not find container \"be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8\": container with ID starting with be1b06b2737e5899a5bdfa2eb1d85f412ff9b487949ec0e87da90d5548a122d8 not found: ID does not exist" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.982997 4759 scope.go:117] "RemoveContainer" containerID="ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423" Dec 05 02:11:57 crc kubenswrapper[4759]: E1205 02:11:57.983558 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423\": container with ID starting with ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423 not found: ID does not exist" containerID="ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.983585 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423"} err="failed to get container status \"ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423\": rpc error: code = NotFound desc = could not find container \"ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423\": container with ID starting with ae1afc9339a5641b95659fa4c37d4dba2ca802ac1e4618c9a0369b0c5a5da423 not found: ID does not exist" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.983599 4759 scope.go:117] "RemoveContainer" containerID="aee22a4910c8cc0c0cd7d8b6cb428e280494390fad68f4721c4b8381f1e599a0" Dec 05 02:11:57 crc kubenswrapper[4759]: E1205 02:11:57.983989 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aee22a4910c8cc0c0cd7d8b6cb428e280494390fad68f4721c4b8381f1e599a0\": container with ID starting with aee22a4910c8cc0c0cd7d8b6cb428e280494390fad68f4721c4b8381f1e599a0 not found: ID does not exist" containerID="aee22a4910c8cc0c0cd7d8b6cb428e280494390fad68f4721c4b8381f1e599a0" Dec 05 02:11:57 crc kubenswrapper[4759]: I1205 02:11:57.984010 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aee22a4910c8cc0c0cd7d8b6cb428e280494390fad68f4721c4b8381f1e599a0"} err="failed to get container status \"aee22a4910c8cc0c0cd7d8b6cb428e280494390fad68f4721c4b8381f1e599a0\": rpc error: code = NotFound desc = could not find container \"aee22a4910c8cc0c0cd7d8b6cb428e280494390fad68f4721c4b8381f1e599a0\": container with ID starting with aee22a4910c8cc0c0cd7d8b6cb428e280494390fad68f4721c4b8381f1e599a0 not found: ID does not exist" Dec 05 02:11:59 crc kubenswrapper[4759]: I1205 02:11:59.200069 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" path="/var/lib/kubelet/pods/2e9b0291-a6e8-4e18-a915-a7a3db4a14d8/volumes" Dec 05 02:12:07 crc kubenswrapper[4759]: I1205 02:12:07.156129 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:12:07 crc kubenswrapper[4759]: E1205 02:12:07.157202 4759 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:12:18 crc kubenswrapper[4759]: I1205 02:12:18.156911 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:12:18 crc kubenswrapper[4759]: E1205 02:12:18.158413 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:12:30 crc kubenswrapper[4759]: I1205 02:12:30.155585 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:12:30 crc kubenswrapper[4759]: E1205 02:12:30.156254 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:12:44 crc kubenswrapper[4759]: I1205 02:12:44.156507 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:12:44 crc kubenswrapper[4759]: E1205 02:12:44.157410 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:12:58 crc kubenswrapper[4759]: I1205 02:12:58.156135 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:12:58 crc kubenswrapper[4759]: E1205 02:12:58.157158 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:13:10 crc kubenswrapper[4759]: I1205 02:13:10.155533 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:13:10 crc kubenswrapper[4759]: E1205 02:13:10.156412 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:13:21 crc kubenswrapper[4759]: I1205 02:13:21.163405 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:13:21 crc kubenswrapper[4759]: E1205 02:13:21.164093 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:13:34 crc kubenswrapper[4759]: I1205 02:13:34.156960 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:13:34 crc kubenswrapper[4759]: E1205 02:13:34.157736 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.644432 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kgmcs"] Dec 05 02:13:44 crc kubenswrapper[4759]: E1205 02:13:44.645636 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" containerName="extract-content" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.645656 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" containerName="extract-content" Dec 05 02:13:44 crc kubenswrapper[4759]: E1205 02:13:44.645683 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" containerName="extract-utilities" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.645691 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" containerName="extract-utilities" Dec 05 02:13:44 crc kubenswrapper[4759]: E1205 02:13:44.645743 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" containerName="registry-server" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.645752 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" containerName="registry-server" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.646068 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e9b0291-a6e8-4e18-a915-a7a3db4a14d8" containerName="registry-server" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.648095 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.658367 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgmcs"] Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.773996 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-catalog-content\") pod \"community-operators-kgmcs\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.774064 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-utilities\") pod \"community-operators-kgmcs\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.774177 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htpsj\" (UniqueName: \"kubernetes.io/projected/a7942115-3be7-44aa-acd2-c1e427cd824a-kube-api-access-htpsj\") pod \"community-operators-kgmcs\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.876730 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-catalog-content\") pod \"community-operators-kgmcs\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.876793 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-utilities\") pod \"community-operators-kgmcs\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.876840 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htpsj\" (UniqueName: \"kubernetes.io/projected/a7942115-3be7-44aa-acd2-c1e427cd824a-kube-api-access-htpsj\") pod \"community-operators-kgmcs\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.877991 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-catalog-content\") pod \"community-operators-kgmcs\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.878398 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-utilities\") pod \"community-operators-kgmcs\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.896935 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-htpsj\" (UniqueName: \"kubernetes.io/projected/a7942115-3be7-44aa-acd2-c1e427cd824a-kube-api-access-htpsj\") pod \"community-operators-kgmcs\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:44 crc kubenswrapper[4759]: I1205 02:13:44.976558 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:45 crc kubenswrapper[4759]: I1205 02:13:45.493034 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgmcs"] Dec 05 02:13:45 crc kubenswrapper[4759]: W1205 02:13:45.498968 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7942115_3be7_44aa_acd2_c1e427cd824a.slice/crio-7bdbee75b749299532dbe8908c7b669fb7e982a13e58d75e23e82f193400dcb0 WatchSource:0}: Error finding container 7bdbee75b749299532dbe8908c7b669fb7e982a13e58d75e23e82f193400dcb0: Status 404 returned error can't find the container with id 7bdbee75b749299532dbe8908c7b669fb7e982a13e58d75e23e82f193400dcb0 Dec 05 02:13:46 crc kubenswrapper[4759]: I1205 02:13:46.155994 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:13:46 crc kubenswrapper[4759]: E1205 02:13:46.156783 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:13:46 crc kubenswrapper[4759]: I1205 02:13:46.220523 4759 generic.go:334] "Generic (PLEG): container finished" podID="a7942115-3be7-44aa-acd2-c1e427cd824a" containerID="6d8eb5df0ac579fb027e922efaa79bc1315d1c0842d62067b3d0116612f28590" exitCode=0 Dec 05 02:13:46 crc kubenswrapper[4759]: I1205 02:13:46.220578 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmcs" event={"ID":"a7942115-3be7-44aa-acd2-c1e427cd824a","Type":"ContainerDied","Data":"6d8eb5df0ac579fb027e922efaa79bc1315d1c0842d62067b3d0116612f28590"} Dec 05 02:13:46 crc kubenswrapper[4759]: I1205 02:13:46.220612 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmcs" event={"ID":"a7942115-3be7-44aa-acd2-c1e427cd824a","Type":"ContainerStarted","Data":"7bdbee75b749299532dbe8908c7b669fb7e982a13e58d75e23e82f193400dcb0"} Dec 05 02:13:47 crc kubenswrapper[4759]: I1205 02:13:47.234893 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmcs" event={"ID":"a7942115-3be7-44aa-acd2-c1e427cd824a","Type":"ContainerStarted","Data":"1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7"} Dec 05 02:13:48 crc kubenswrapper[4759]: I1205 02:13:48.255707 4759 generic.go:334] "Generic (PLEG): container finished" podID="a7942115-3be7-44aa-acd2-c1e427cd824a" containerID="1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7" exitCode=0 Dec 05 02:13:48 crc kubenswrapper[4759]: I1205 02:13:48.255800 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmcs" 
event={"ID":"a7942115-3be7-44aa-acd2-c1e427cd824a","Type":"ContainerDied","Data":"1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7"} Dec 05 02:13:49 crc kubenswrapper[4759]: I1205 02:13:49.267562 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmcs" event={"ID":"a7942115-3be7-44aa-acd2-c1e427cd824a","Type":"ContainerStarted","Data":"f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52"} Dec 05 02:13:49 crc kubenswrapper[4759]: I1205 02:13:49.285656 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kgmcs" podStartSLOduration=2.7717255549999997 podStartE2EDuration="5.285613974s" podCreationTimestamp="2025-12-05 02:13:44 +0000 UTC" firstStartedPulling="2025-12-05 02:13:46.227023237 +0000 UTC m=+6645.442684187" lastFinishedPulling="2025-12-05 02:13:48.740911656 +0000 UTC m=+6647.956572606" observedRunningTime="2025-12-05 02:13:49.283600095 +0000 UTC m=+6648.499261055" watchObservedRunningTime="2025-12-05 02:13:49.285613974 +0000 UTC m=+6648.501274924" Dec 05 02:13:54 crc kubenswrapper[4759]: I1205 02:13:54.977697 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:54 crc kubenswrapper[4759]: I1205 02:13:54.978395 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:55 crc kubenswrapper[4759]: I1205 02:13:55.054537 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:55 crc kubenswrapper[4759]: I1205 02:13:55.412123 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:55 crc kubenswrapper[4759]: I1205 02:13:55.466020 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgmcs"] Dec 05 02:13:57 crc kubenswrapper[4759]: I1205 02:13:57.369873 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kgmcs" podUID="a7942115-3be7-44aa-acd2-c1e427cd824a" containerName="registry-server" containerID="cri-o://f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52" gracePeriod=2 Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.141207 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.156661 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:13:58 crc kubenswrapper[4759]: E1205 02:13:58.157098 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.218856 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-catalog-content\") pod \"a7942115-3be7-44aa-acd2-c1e427cd824a\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.219369 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-utilities\") pod \"a7942115-3be7-44aa-acd2-c1e427cd824a\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.219423 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htpsj\" (UniqueName: \"kubernetes.io/projected/a7942115-3be7-44aa-acd2-c1e427cd824a-kube-api-access-htpsj\") pod \"a7942115-3be7-44aa-acd2-c1e427cd824a\" (UID: \"a7942115-3be7-44aa-acd2-c1e427cd824a\") " Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.220035 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-utilities" (OuterVolumeSpecName: "utilities") pod "a7942115-3be7-44aa-acd2-c1e427cd824a" (UID: "a7942115-3be7-44aa-acd2-c1e427cd824a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.220476 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.225750 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7942115-3be7-44aa-acd2-c1e427cd824a-kube-api-access-htpsj" (OuterVolumeSpecName: "kube-api-access-htpsj") pod "a7942115-3be7-44aa-acd2-c1e427cd824a" (UID: "a7942115-3be7-44aa-acd2-c1e427cd824a"). InnerVolumeSpecName "kube-api-access-htpsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.324844 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htpsj\" (UniqueName: \"kubernetes.io/projected/a7942115-3be7-44aa-acd2-c1e427cd824a-kube-api-access-htpsj\") on node \"crc\" DevicePath \"\"" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.325880 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7942115-3be7-44aa-acd2-c1e427cd824a" (UID: "a7942115-3be7-44aa-acd2-c1e427cd824a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.380683 4759 generic.go:334] "Generic (PLEG): container finished" podID="a7942115-3be7-44aa-acd2-c1e427cd824a" containerID="f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52" exitCode=0 Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.380724 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmcs" event={"ID":"a7942115-3be7-44aa-acd2-c1e427cd824a","Type":"ContainerDied","Data":"f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52"} Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.380744 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgmcs" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.380758 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmcs" event={"ID":"a7942115-3be7-44aa-acd2-c1e427cd824a","Type":"ContainerDied","Data":"7bdbee75b749299532dbe8908c7b669fb7e982a13e58d75e23e82f193400dcb0"} Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.380791 4759 scope.go:117] "RemoveContainer" containerID="f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.422769 4759 scope.go:117] "RemoveContainer" containerID="1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.426245 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7942115-3be7-44aa-acd2-c1e427cd824a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.429302 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgmcs"] Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.440840 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kgmcs"] Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.455614 4759 scope.go:117] "RemoveContainer" containerID="6d8eb5df0ac579fb027e922efaa79bc1315d1c0842d62067b3d0116612f28590" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.507850 4759 scope.go:117] "RemoveContainer" containerID="f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52" Dec 05 02:13:58 crc kubenswrapper[4759]: E1205 02:13:58.508357 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52\": container with ID starting with 
f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52 not found: ID does not exist" containerID="f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.508409 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52"} err="failed to get container status \"f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52\": rpc error: code = NotFound desc = could not find container \"f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52\": container with ID starting with f35edd6bb5d446c813980489120b8208287e80000c61284feefaa66a5a141c52 not found: ID does not exist" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.508439 4759 scope.go:117] "RemoveContainer" containerID="1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7" Dec 05 02:13:58 crc kubenswrapper[4759]: E1205 02:13:58.508733 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7\": container with ID starting with 1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7 not found: ID does not exist" containerID="1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.508774 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7"} err="failed to get container status \"1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7\": rpc error: code = NotFound desc = could not find container \"1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7\": container with ID starting with 1a8ef358d82fb70c841ff699931e46a03abd467465b4e7d5138ec31de62a07d7 not found: ID does not exist" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.508797 4759 scope.go:117] "RemoveContainer" containerID="6d8eb5df0ac579fb027e922efaa79bc1315d1c0842d62067b3d0116612f28590" Dec 05 02:13:58 crc kubenswrapper[4759]: E1205 02:13:58.509004 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d8eb5df0ac579fb027e922efaa79bc1315d1c0842d62067b3d0116612f28590\": container with ID starting with 6d8eb5df0ac579fb027e922efaa79bc1315d1c0842d62067b3d0116612f28590 not found: ID does not exist" containerID="6d8eb5df0ac579fb027e922efaa79bc1315d1c0842d62067b3d0116612f28590" Dec 05 02:13:58 crc kubenswrapper[4759]: I1205 02:13:58.509029 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d8eb5df0ac579fb027e922efaa79bc1315d1c0842d62067b3d0116612f28590"} err="failed to get container status \"6d8eb5df0ac579fb027e922efaa79bc1315d1c0842d62067b3d0116612f28590\": rpc error: code = NotFound desc = could not find container \"6d8eb5df0ac579fb027e922efaa79bc1315d1c0842d62067b3d0116612f28590\": container with ID starting with 6d8eb5df0ac579fb027e922efaa79bc1315d1c0842d62067b3d0116612f28590 not found: ID does not exist" Dec 05 02:13:59 crc kubenswrapper[4759]: I1205 02:13:59.176711 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7942115-3be7-44aa-acd2-c1e427cd824a" path="/var/lib/kubelet/pods/a7942115-3be7-44aa-acd2-c1e427cd824a/volumes" Dec 05 02:14:13 crc kubenswrapper[4759]: I1205 02:14:13.155938 
4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:14:13 crc kubenswrapper[4759]: I1205 02:14:13.550029 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"1452a232d4038c13cfb821d97797541014126c3820e692c4a5e56faf9de35203"} Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.221945 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz"] Dec 05 02:15:00 crc kubenswrapper[4759]: E1205 02:15:00.223004 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7942115-3be7-44aa-acd2-c1e427cd824a" containerName="registry-server" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.223019 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7942115-3be7-44aa-acd2-c1e427cd824a" containerName="registry-server" Dec 05 02:15:00 crc kubenswrapper[4759]: E1205 02:15:00.223046 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7942115-3be7-44aa-acd2-c1e427cd824a" containerName="extract-content" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.223054 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7942115-3be7-44aa-acd2-c1e427cd824a" containerName="extract-content" Dec 05 02:15:00 crc kubenswrapper[4759]: E1205 02:15:00.223066 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7942115-3be7-44aa-acd2-c1e427cd824a" containerName="extract-utilities" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.223072 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7942115-3be7-44aa-acd2-c1e427cd824a" containerName="extract-utilities" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.223327 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7942115-3be7-44aa-acd2-c1e427cd824a" containerName="registry-server" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.224146 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.234828 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz"] Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.261033 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.261041 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.342409 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4jpc\" (UniqueName: \"kubernetes.io/projected/73f1e9e7-90c1-4623-8ba8-af9d32d72250-kube-api-access-n4jpc\") pod \"collect-profiles-29415015-hkcvz\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.342476 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73f1e9e7-90c1-4623-8ba8-af9d32d72250-secret-volume\") pod \"collect-profiles-29415015-hkcvz\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.342604 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73f1e9e7-90c1-4623-8ba8-af9d32d72250-config-volume\") pod \"collect-profiles-29415015-hkcvz\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.444871 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4jpc\" (UniqueName: \"kubernetes.io/projected/73f1e9e7-90c1-4623-8ba8-af9d32d72250-kube-api-access-n4jpc\") pod \"collect-profiles-29415015-hkcvz\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.444955 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73f1e9e7-90c1-4623-8ba8-af9d32d72250-secret-volume\") pod \"collect-profiles-29415015-hkcvz\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.445230 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73f1e9e7-90c1-4623-8ba8-af9d32d72250-config-volume\") pod \"collect-profiles-29415015-hkcvz\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.446536 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73f1e9e7-90c1-4623-8ba8-af9d32d72250-config-volume\") pod 
\"collect-profiles-29415015-hkcvz\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.460169 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73f1e9e7-90c1-4623-8ba8-af9d32d72250-secret-volume\") pod \"collect-profiles-29415015-hkcvz\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.467523 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4jpc\" (UniqueName: \"kubernetes.io/projected/73f1e9e7-90c1-4623-8ba8-af9d32d72250-kube-api-access-n4jpc\") pod \"collect-profiles-29415015-hkcvz\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:00 crc kubenswrapper[4759]: I1205 02:15:00.553194 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:01 crc kubenswrapper[4759]: I1205 02:15:01.047757 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz"] Dec 05 02:15:01 crc kubenswrapper[4759]: I1205 02:15:01.142862 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" event={"ID":"73f1e9e7-90c1-4623-8ba8-af9d32d72250","Type":"ContainerStarted","Data":"89b2fe0b836dcb48c2746820d37839f33e4658a1ac37ece61607865db21008df"} Dec 05 02:15:02 crc kubenswrapper[4759]: I1205 02:15:02.157379 4759 generic.go:334] "Generic (PLEG): container finished" podID="73f1e9e7-90c1-4623-8ba8-af9d32d72250" containerID="87865ea1102141ebea11491b64bc4f54232def3dacf30ca41ccbd1a89a6ca501" exitCode=0 Dec 05 02:15:02 crc kubenswrapper[4759]: I1205 02:15:02.157486 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" event={"ID":"73f1e9e7-90c1-4623-8ba8-af9d32d72250","Type":"ContainerDied","Data":"87865ea1102141ebea11491b64bc4f54232def3dacf30ca41ccbd1a89a6ca501"} Dec 05 02:15:03 crc kubenswrapper[4759]: I1205 02:15:03.646240 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:03 crc kubenswrapper[4759]: I1205 02:15:03.729542 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73f1e9e7-90c1-4623-8ba8-af9d32d72250-config-volume\") pod \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " Dec 05 02:15:03 crc kubenswrapper[4759]: I1205 02:15:03.729709 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4jpc\" (UniqueName: \"kubernetes.io/projected/73f1e9e7-90c1-4623-8ba8-af9d32d72250-kube-api-access-n4jpc\") pod \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " Dec 05 02:15:03 crc kubenswrapper[4759]: I1205 02:15:03.729758 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73f1e9e7-90c1-4623-8ba8-af9d32d72250-secret-volume\") pod \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\" (UID: \"73f1e9e7-90c1-4623-8ba8-af9d32d72250\") " Dec 05 02:15:03 crc kubenswrapper[4759]: I1205 02:15:03.730416 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f1e9e7-90c1-4623-8ba8-af9d32d72250-config-volume" (OuterVolumeSpecName: "config-volume") pod "73f1e9e7-90c1-4623-8ba8-af9d32d72250" (UID: "73f1e9e7-90c1-4623-8ba8-af9d32d72250"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 02:15:03 crc kubenswrapper[4759]: I1205 02:15:03.738994 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f1e9e7-90c1-4623-8ba8-af9d32d72250-kube-api-access-n4jpc" (OuterVolumeSpecName: "kube-api-access-n4jpc") pod "73f1e9e7-90c1-4623-8ba8-af9d32d72250" (UID: "73f1e9e7-90c1-4623-8ba8-af9d32d72250"). InnerVolumeSpecName "kube-api-access-n4jpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:15:03 crc kubenswrapper[4759]: I1205 02:15:03.739812 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f1e9e7-90c1-4623-8ba8-af9d32d72250-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "73f1e9e7-90c1-4623-8ba8-af9d32d72250" (UID: "73f1e9e7-90c1-4623-8ba8-af9d32d72250"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 02:15:03 crc kubenswrapper[4759]: I1205 02:15:03.831897 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4jpc\" (UniqueName: \"kubernetes.io/projected/73f1e9e7-90c1-4623-8ba8-af9d32d72250-kube-api-access-n4jpc\") on node \"crc\" DevicePath \"\"" Dec 05 02:15:03 crc kubenswrapper[4759]: I1205 02:15:03.831926 4759 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73f1e9e7-90c1-4623-8ba8-af9d32d72250-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 02:15:03 crc kubenswrapper[4759]: I1205 02:15:03.831937 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73f1e9e7-90c1-4623-8ba8-af9d32d72250-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 02:15:04 crc kubenswrapper[4759]: I1205 02:15:04.183823 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" event={"ID":"73f1e9e7-90c1-4623-8ba8-af9d32d72250","Type":"ContainerDied","Data":"89b2fe0b836dcb48c2746820d37839f33e4658a1ac37ece61607865db21008df"} Dec 05 02:15:04 crc kubenswrapper[4759]: I1205 02:15:04.183867 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b2fe0b836dcb48c2746820d37839f33e4658a1ac37ece61607865db21008df" Dec 05 02:15:04 crc kubenswrapper[4759]: I1205 02:15:04.183877 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415015-hkcvz" Dec 05 02:15:04 crc kubenswrapper[4759]: I1205 02:15:04.730978 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx"] Dec 05 02:15:04 crc kubenswrapper[4759]: I1205 02:15:04.740849 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414970-wnjpx"] Dec 05 02:15:05 crc kubenswrapper[4759]: I1205 02:15:05.178816 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b1b96e7-6a9d-4297-8bb8-788206857735" path="/var/lib/kubelet/pods/1b1b96e7-6a9d-4297-8bb8-788206857735/volumes" Dec 05 02:15:38 crc kubenswrapper[4759]: I1205 02:15:38.137421 4759 scope.go:117] "RemoveContainer" containerID="346ad81de72ebcd144e403f017220963b13430557eecdf2bc3468ec12c853ec1" Dec 05 02:16:34 crc kubenswrapper[4759]: I1205 02:16:34.433134 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:16:34 crc kubenswrapper[4759]: I1205 02:16:34.434054 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:17:04 crc kubenswrapper[4759]: I1205 02:17:04.433083 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 05 02:17:04 crc kubenswrapper[4759]: I1205 02:17:04.433641 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:17:23 crc kubenswrapper[4759]: I1205 02:17:23.990191 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xk5nk"] Dec 05 02:17:23 crc kubenswrapper[4759]: E1205 02:17:23.991569 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f1e9e7-90c1-4623-8ba8-af9d32d72250" containerName="collect-profiles" Dec 05 02:17:23 crc kubenswrapper[4759]: I1205 02:17:23.991591 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f1e9e7-90c1-4623-8ba8-af9d32d72250" containerName="collect-profiles" Dec 05 02:17:23 crc kubenswrapper[4759]: I1205 02:17:23.992033 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f1e9e7-90c1-4623-8ba8-af9d32d72250" containerName="collect-profiles" Dec 05 02:17:23 crc kubenswrapper[4759]: I1205 02:17:23.996008 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.019212 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk5nk"] Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.135621 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-catalog-content\") pod \"redhat-marketplace-xk5nk\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.136024 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxgxj\" (UniqueName: \"kubernetes.io/projected/f991f55b-0752-48c4-beb5-a5481dcea4ad-kube-api-access-jxgxj\") pod \"redhat-marketplace-xk5nk\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.136125 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-utilities\") pod \"redhat-marketplace-xk5nk\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.238596 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxgxj\" (UniqueName: \"kubernetes.io/projected/f991f55b-0752-48c4-beb5-a5481dcea4ad-kube-api-access-jxgxj\") pod \"redhat-marketplace-xk5nk\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.238872 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-utilities\") pod \"redhat-marketplace-xk5nk\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " 
pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.239083 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-catalog-content\") pod \"redhat-marketplace-xk5nk\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.239829 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-catalog-content\") pod \"redhat-marketplace-xk5nk\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.240414 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-utilities\") pod \"redhat-marketplace-xk5nk\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.262406 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxgxj\" (UniqueName: \"kubernetes.io/projected/f991f55b-0752-48c4-beb5-a5481dcea4ad-kube-api-access-jxgxj\") pod \"redhat-marketplace-xk5nk\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.334765 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.851542 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk5nk"] Dec 05 02:17:24 crc kubenswrapper[4759]: I1205 02:17:24.997730 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk5nk" event={"ID":"f991f55b-0752-48c4-beb5-a5481dcea4ad","Type":"ContainerStarted","Data":"c923ff79ee9057c880316a1c1648c35819c57284ab9926476c2fea40cac8fe1c"} Dec 05 02:17:26 crc kubenswrapper[4759]: I1205 02:17:26.010635 4759 generic.go:334] "Generic (PLEG): container finished" podID="f991f55b-0752-48c4-beb5-a5481dcea4ad" containerID="c336beec9fed50151725b3503e57ee18480f8c103af4975c7098304bacc5f4bc" exitCode=0 Dec 05 02:17:26 crc kubenswrapper[4759]: I1205 02:17:26.010745 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk5nk" event={"ID":"f991f55b-0752-48c4-beb5-a5481dcea4ad","Type":"ContainerDied","Data":"c336beec9fed50151725b3503e57ee18480f8c103af4975c7098304bacc5f4bc"} Dec 05 02:17:26 crc kubenswrapper[4759]: I1205 02:17:26.014832 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 02:17:27 crc kubenswrapper[4759]: I1205 02:17:27.024876 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk5nk" event={"ID":"f991f55b-0752-48c4-beb5-a5481dcea4ad","Type":"ContainerStarted","Data":"0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe"} Dec 05 02:17:28 crc kubenswrapper[4759]: I1205 02:17:28.035657 4759 generic.go:334] "Generic (PLEG): container finished" podID="f991f55b-0752-48c4-beb5-a5481dcea4ad" 
containerID="0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe" exitCode=0 Dec 05 02:17:28 crc kubenswrapper[4759]: I1205 02:17:28.035794 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk5nk" event={"ID":"f991f55b-0752-48c4-beb5-a5481dcea4ad","Type":"ContainerDied","Data":"0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe"} Dec 05 02:17:29 crc kubenswrapper[4759]: I1205 02:17:29.046540 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk5nk" event={"ID":"f991f55b-0752-48c4-beb5-a5481dcea4ad","Type":"ContainerStarted","Data":"b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81"} Dec 05 02:17:29 crc kubenswrapper[4759]: I1205 02:17:29.066045 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xk5nk" podStartSLOduration=3.631048394 podStartE2EDuration="6.065991523s" podCreationTimestamp="2025-12-05 02:17:23 +0000 UTC" firstStartedPulling="2025-12-05 02:17:26.013864274 +0000 UTC m=+6865.229525244" lastFinishedPulling="2025-12-05 02:17:28.448807413 +0000 UTC m=+6867.664468373" observedRunningTime="2025-12-05 02:17:29.063146384 +0000 UTC m=+6868.278807334" watchObservedRunningTime="2025-12-05 02:17:29.065991523 +0000 UTC m=+6868.281652473" Dec 05 02:17:34 crc kubenswrapper[4759]: I1205 02:17:34.334859 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:34 crc kubenswrapper[4759]: I1205 02:17:34.335723 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:34 crc kubenswrapper[4759]: I1205 02:17:34.422535 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:34 crc kubenswrapper[4759]: I1205 02:17:34.433139 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:17:34 crc kubenswrapper[4759]: I1205 02:17:34.433198 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:17:34 crc kubenswrapper[4759]: I1205 02:17:34.433246 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 02:17:34 crc kubenswrapper[4759]: I1205 02:17:34.434091 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1452a232d4038c13cfb821d97797541014126c3820e692c4a5e56faf9de35203"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 02:17:34 crc kubenswrapper[4759]: I1205 02:17:34.434181 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" 
podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://1452a232d4038c13cfb821d97797541014126c3820e692c4a5e56faf9de35203" gracePeriod=600 Dec 05 02:17:35 crc kubenswrapper[4759]: I1205 02:17:35.119336 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"1452a232d4038c13cfb821d97797541014126c3820e692c4a5e56faf9de35203"} Dec 05 02:17:35 crc kubenswrapper[4759]: I1205 02:17:35.119957 4759 scope.go:117] "RemoveContainer" containerID="c802cdc6aae4a36226e42e9c47958cba0427e2d2960d132bcd95ebfd7805a2cf" Dec 05 02:17:35 crc kubenswrapper[4759]: I1205 02:17:35.119336 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="1452a232d4038c13cfb821d97797541014126c3820e692c4a5e56faf9de35203" exitCode=0 Dec 05 02:17:35 crc kubenswrapper[4759]: I1205 02:17:35.120104 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"} Dec 05 02:17:35 crc kubenswrapper[4759]: I1205 02:17:35.206772 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:35 crc kubenswrapper[4759]: I1205 02:17:35.268854 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk5nk"] Dec 05 02:17:37 crc kubenswrapper[4759]: I1205 02:17:37.154584 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xk5nk" podUID="f991f55b-0752-48c4-beb5-a5481dcea4ad" containerName="registry-server" containerID="cri-o://b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81" gracePeriod=2 Dec 05 02:17:37 crc kubenswrapper[4759]: I1205 02:17:37.697456 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:37 crc kubenswrapper[4759]: I1205 02:17:37.814072 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-utilities\") pod \"f991f55b-0752-48c4-beb5-a5481dcea4ad\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " Dec 05 02:17:37 crc kubenswrapper[4759]: I1205 02:17:37.814265 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-catalog-content\") pod \"f991f55b-0752-48c4-beb5-a5481dcea4ad\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " Dec 05 02:17:37 crc kubenswrapper[4759]: I1205 02:17:37.814497 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxgxj\" (UniqueName: \"kubernetes.io/projected/f991f55b-0752-48c4-beb5-a5481dcea4ad-kube-api-access-jxgxj\") pod \"f991f55b-0752-48c4-beb5-a5481dcea4ad\" (UID: \"f991f55b-0752-48c4-beb5-a5481dcea4ad\") " Dec 05 02:17:37 crc kubenswrapper[4759]: I1205 02:17:37.815061 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-utilities" (OuterVolumeSpecName: "utilities") pod "f991f55b-0752-48c4-beb5-a5481dcea4ad" (UID: "f991f55b-0752-48c4-beb5-a5481dcea4ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:17:37 crc kubenswrapper[4759]: I1205 02:17:37.815198 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:17:37 crc kubenswrapper[4759]: I1205 02:17:37.825486 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f991f55b-0752-48c4-beb5-a5481dcea4ad-kube-api-access-jxgxj" (OuterVolumeSpecName: "kube-api-access-jxgxj") pod "f991f55b-0752-48c4-beb5-a5481dcea4ad" (UID: "f991f55b-0752-48c4-beb5-a5481dcea4ad"). InnerVolumeSpecName "kube-api-access-jxgxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:17:37 crc kubenswrapper[4759]: I1205 02:17:37.841915 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f991f55b-0752-48c4-beb5-a5481dcea4ad" (UID: "f991f55b-0752-48c4-beb5-a5481dcea4ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:17:37 crc kubenswrapper[4759]: I1205 02:17:37.917884 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxgxj\" (UniqueName: \"kubernetes.io/projected/f991f55b-0752-48c4-beb5-a5481dcea4ad-kube-api-access-jxgxj\") on node \"crc\" DevicePath \"\"" Dec 05 02:17:37 crc kubenswrapper[4759]: I1205 02:17:37.918126 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f991f55b-0752-48c4-beb5-a5481dcea4ad-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.172575 4759 generic.go:334] "Generic (PLEG): container finished" podID="f991f55b-0752-48c4-beb5-a5481dcea4ad" containerID="b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81" exitCode=0 Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.172660 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk5nk" Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.172666 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk5nk" event={"ID":"f991f55b-0752-48c4-beb5-a5481dcea4ad","Type":"ContainerDied","Data":"b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81"} Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.174338 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk5nk" event={"ID":"f991f55b-0752-48c4-beb5-a5481dcea4ad","Type":"ContainerDied","Data":"c923ff79ee9057c880316a1c1648c35819c57284ab9926476c2fea40cac8fe1c"} Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.174385 4759 scope.go:117] "RemoveContainer" containerID="b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81" Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.220811 4759 scope.go:117] "RemoveContainer" containerID="0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe" Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.233083 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk5nk"] Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.256689 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk5nk"] Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.266182 4759 scope.go:117] "RemoveContainer" containerID="c336beec9fed50151725b3503e57ee18480f8c103af4975c7098304bacc5f4bc" Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.327601 4759 scope.go:117] "RemoveContainer" containerID="b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81" Dec 05 02:17:38 crc kubenswrapper[4759]: E1205 02:17:38.330034 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81\": container with ID starting with b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81 not found: ID does not exist" containerID="b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81" Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.330103 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81"} err="failed to get container status 
\"b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81\": rpc error: code = NotFound desc = could not find container \"b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81\": container with ID starting with b9918662a28023dc4fcbade6f1b929e919a1cd283ba015090ac4fa250d331c81 not found: ID does not exist" Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.330146 4759 scope.go:117] "RemoveContainer" containerID="0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe" Dec 05 02:17:38 crc kubenswrapper[4759]: E1205 02:17:38.330952 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe\": container with ID starting with 0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe not found: ID does not exist" containerID="0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe" Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.331007 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe"} err="failed to get container status \"0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe\": rpc error: code = NotFound desc = could not find container \"0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe\": container with ID starting with 0bd1cb5a34b420b28a418364101eaf8efc606dce0f0745c1c4036bf1613c29fe not found: ID does not exist" Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.331045 4759 scope.go:117] "RemoveContainer" containerID="c336beec9fed50151725b3503e57ee18480f8c103af4975c7098304bacc5f4bc" Dec 05 02:17:38 crc kubenswrapper[4759]: E1205 02:17:38.331523 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c336beec9fed50151725b3503e57ee18480f8c103af4975c7098304bacc5f4bc\": container with ID starting with c336beec9fed50151725b3503e57ee18480f8c103af4975c7098304bacc5f4bc not found: ID does not exist" containerID="c336beec9fed50151725b3503e57ee18480f8c103af4975c7098304bacc5f4bc" Dec 05 02:17:38 crc kubenswrapper[4759]: I1205 02:17:38.331562 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c336beec9fed50151725b3503e57ee18480f8c103af4975c7098304bacc5f4bc"} err="failed to get container status \"c336beec9fed50151725b3503e57ee18480f8c103af4975c7098304bacc5f4bc\": rpc error: code = NotFound desc = could not find container \"c336beec9fed50151725b3503e57ee18480f8c103af4975c7098304bacc5f4bc\": container with ID starting with c336beec9fed50151725b3503e57ee18480f8c103af4975c7098304bacc5f4bc not found: ID does not exist" Dec 05 02:17:39 crc kubenswrapper[4759]: I1205 02:17:39.170621 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f991f55b-0752-48c4-beb5-a5481dcea4ad" path="/var/lib/kubelet/pods/f991f55b-0752-48c4-beb5-a5481dcea4ad/volumes" Dec 05 02:19:34 crc kubenswrapper[4759]: I1205 02:19:34.433523 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:19:34 crc kubenswrapper[4759]: I1205 02:19:34.434207 4759 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:20:04 crc kubenswrapper[4759]: I1205 02:20:04.434041 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:20:04 crc kubenswrapper[4759]: I1205 02:20:04.434770 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.536005 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2gdq6"] Dec 05 02:20:22 crc kubenswrapper[4759]: E1205 02:20:22.537383 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f991f55b-0752-48c4-beb5-a5481dcea4ad" containerName="extract-content" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.537401 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f991f55b-0752-48c4-beb5-a5481dcea4ad" containerName="extract-content" Dec 05 02:20:22 crc kubenswrapper[4759]: E1205 02:20:22.537418 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f991f55b-0752-48c4-beb5-a5481dcea4ad" containerName="extract-utilities" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.537426 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f991f55b-0752-48c4-beb5-a5481dcea4ad" containerName="extract-utilities" Dec 05 02:20:22 crc kubenswrapper[4759]: E1205 02:20:22.537452 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f991f55b-0752-48c4-beb5-a5481dcea4ad" containerName="registry-server" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.537463 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f991f55b-0752-48c4-beb5-a5481dcea4ad" containerName="registry-server" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.537805 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f991f55b-0752-48c4-beb5-a5481dcea4ad" containerName="registry-server" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.539750 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.556718 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2gdq6"] Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.648926 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-catalog-content\") pod \"redhat-operators-2gdq6\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.649000 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vvh\" (UniqueName: \"kubernetes.io/projected/c024b9c6-341f-4277-b051-7b988b17834e-kube-api-access-x9vvh\") pod \"redhat-operators-2gdq6\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.649033 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-utilities\") pod \"redhat-operators-2gdq6\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.751922 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-catalog-content\") pod \"redhat-operators-2gdq6\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.752013 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vvh\" (UniqueName: \"kubernetes.io/projected/c024b9c6-341f-4277-b051-7b988b17834e-kube-api-access-x9vvh\") pod \"redhat-operators-2gdq6\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.752043 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-utilities\") pod \"redhat-operators-2gdq6\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.753150 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-catalog-content\") pod \"redhat-operators-2gdq6\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.753190 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-utilities\") pod \"redhat-operators-2gdq6\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.781351 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x9vvh\" (UniqueName: \"kubernetes.io/projected/c024b9c6-341f-4277-b051-7b988b17834e-kube-api-access-x9vvh\") pod \"redhat-operators-2gdq6\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:22 crc kubenswrapper[4759]: I1205 02:20:22.907597 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:23 crc kubenswrapper[4759]: I1205 02:20:23.441228 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2gdq6"] Dec 05 02:20:23 crc kubenswrapper[4759]: W1205 02:20:23.456127 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc024b9c6_341f_4277_b051_7b988b17834e.slice/crio-cb60672c9d2df1602802678154b6a1e3f6c99c8ac507b1f7ce71f11849f67915 WatchSource:0}: Error finding container cb60672c9d2df1602802678154b6a1e3f6c99c8ac507b1f7ce71f11849f67915: Status 404 returned error can't find the container with id cb60672c9d2df1602802678154b6a1e3f6c99c8ac507b1f7ce71f11849f67915 Dec 05 02:20:24 crc kubenswrapper[4759]: I1205 02:20:24.360623 4759 generic.go:334] "Generic (PLEG): container finished" podID="c024b9c6-341f-4277-b051-7b988b17834e" containerID="e0ff467482b114bce1f9e7ecfe61db52b7c06e9e177ea5f4784e9bc51af12766" exitCode=0 Dec 05 02:20:24 crc kubenswrapper[4759]: I1205 02:20:24.360674 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gdq6" event={"ID":"c024b9c6-341f-4277-b051-7b988b17834e","Type":"ContainerDied","Data":"e0ff467482b114bce1f9e7ecfe61db52b7c06e9e177ea5f4784e9bc51af12766"} Dec 05 02:20:24 crc kubenswrapper[4759]: I1205 02:20:24.360966 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gdq6" event={"ID":"c024b9c6-341f-4277-b051-7b988b17834e","Type":"ContainerStarted","Data":"cb60672c9d2df1602802678154b6a1e3f6c99c8ac507b1f7ce71f11849f67915"} Dec 05 02:20:25 crc kubenswrapper[4759]: I1205 02:20:25.374140 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gdq6" event={"ID":"c024b9c6-341f-4277-b051-7b988b17834e","Type":"ContainerStarted","Data":"ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429"} Dec 05 02:20:28 crc kubenswrapper[4759]: I1205 02:20:28.430255 4759 generic.go:334] "Generic (PLEG): container finished" podID="c024b9c6-341f-4277-b051-7b988b17834e" containerID="ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429" exitCode=0 Dec 05 02:20:28 crc kubenswrapper[4759]: I1205 02:20:28.430526 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gdq6" event={"ID":"c024b9c6-341f-4277-b051-7b988b17834e","Type":"ContainerDied","Data":"ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429"} Dec 05 02:20:29 crc kubenswrapper[4759]: I1205 02:20:29.443872 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gdq6" event={"ID":"c024b9c6-341f-4277-b051-7b988b17834e","Type":"ContainerStarted","Data":"02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab"} Dec 05 02:20:29 crc kubenswrapper[4759]: I1205 02:20:29.466934 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2gdq6" podStartSLOduration=2.727406512 podStartE2EDuration="7.466901981s" 
podCreationTimestamp="2025-12-05 02:20:22 +0000 UTC" firstStartedPulling="2025-12-05 02:20:24.362629756 +0000 UTC m=+7043.578290706" lastFinishedPulling="2025-12-05 02:20:29.102125225 +0000 UTC m=+7048.317786175" observedRunningTime="2025-12-05 02:20:29.461355446 +0000 UTC m=+7048.677016396" watchObservedRunningTime="2025-12-05 02:20:29.466901981 +0000 UTC m=+7048.682562931" Dec 05 02:20:32 crc kubenswrapper[4759]: I1205 02:20:32.908535 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:32 crc kubenswrapper[4759]: I1205 02:20:32.908891 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:33 crc kubenswrapper[4759]: I1205 02:20:33.975210 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2gdq6" podUID="c024b9c6-341f-4277-b051-7b988b17834e" containerName="registry-server" probeResult="failure" output=< Dec 05 02:20:33 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 02:20:33 crc kubenswrapper[4759]: > Dec 05 02:20:34 crc kubenswrapper[4759]: I1205 02:20:34.441466 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:20:34 crc kubenswrapper[4759]: I1205 02:20:34.441862 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:20:34 crc kubenswrapper[4759]: I1205 02:20:34.441918 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 02:20:34 crc kubenswrapper[4759]: I1205 02:20:34.443151 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 02:20:34 crc kubenswrapper[4759]: I1205 02:20:34.443497 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854" gracePeriod=600 Dec 05 02:20:34 crc kubenswrapper[4759]: E1205 02:20:34.574048 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:20:35 crc kubenswrapper[4759]: I1205 02:20:35.507668 4759 generic.go:334] "Generic (PLEG): container 
finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854" exitCode=0 Dec 05 02:20:35 crc kubenswrapper[4759]: I1205 02:20:35.507744 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"} Dec 05 02:20:35 crc kubenswrapper[4759]: I1205 02:20:35.508083 4759 scope.go:117] "RemoveContainer" containerID="1452a232d4038c13cfb821d97797541014126c3820e692c4a5e56faf9de35203" Dec 05 02:20:35 crc kubenswrapper[4759]: I1205 02:20:35.509094 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854" Dec 05 02:20:35 crc kubenswrapper[4759]: E1205 02:20:35.509460 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:20:42 crc kubenswrapper[4759]: I1205 02:20:42.986119 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:43 crc kubenswrapper[4759]: I1205 02:20:43.058916 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:43 crc kubenswrapper[4759]: I1205 02:20:43.278155 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2gdq6"] Dec 05 02:20:44 crc kubenswrapper[4759]: I1205 02:20:44.612649 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2gdq6" podUID="c024b9c6-341f-4277-b051-7b988b17834e" containerName="registry-server" containerID="cri-o://02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab" gracePeriod=2 Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.154025 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.265527 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9vvh\" (UniqueName: \"kubernetes.io/projected/c024b9c6-341f-4277-b051-7b988b17834e-kube-api-access-x9vvh\") pod \"c024b9c6-341f-4277-b051-7b988b17834e\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.265748 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-catalog-content\") pod \"c024b9c6-341f-4277-b051-7b988b17834e\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.265982 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-utilities\") pod \"c024b9c6-341f-4277-b051-7b988b17834e\" (UID: \"c024b9c6-341f-4277-b051-7b988b17834e\") " Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.267063 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-utilities" (OuterVolumeSpecName: "utilities") pod "c024b9c6-341f-4277-b051-7b988b17834e" (UID: "c024b9c6-341f-4277-b051-7b988b17834e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.268479 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.283444 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c024b9c6-341f-4277-b051-7b988b17834e-kube-api-access-x9vvh" (OuterVolumeSpecName: "kube-api-access-x9vvh") pod "c024b9c6-341f-4277-b051-7b988b17834e" (UID: "c024b9c6-341f-4277-b051-7b988b17834e"). InnerVolumeSpecName "kube-api-access-x9vvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.371416 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9vvh\" (UniqueName: \"kubernetes.io/projected/c024b9c6-341f-4277-b051-7b988b17834e-kube-api-access-x9vvh\") on node \"crc\" DevicePath \"\"" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.421410 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c024b9c6-341f-4277-b051-7b988b17834e" (UID: "c024b9c6-341f-4277-b051-7b988b17834e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.473544 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c024b9c6-341f-4277-b051-7b988b17834e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.629286 4759 generic.go:334] "Generic (PLEG): container finished" podID="c024b9c6-341f-4277-b051-7b988b17834e" containerID="02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab" exitCode=0 Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.629354 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gdq6" event={"ID":"c024b9c6-341f-4277-b051-7b988b17834e","Type":"ContainerDied","Data":"02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab"} Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.629387 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gdq6" event={"ID":"c024b9c6-341f-4277-b051-7b988b17834e","Type":"ContainerDied","Data":"cb60672c9d2df1602802678154b6a1e3f6c99c8ac507b1f7ce71f11849f67915"} Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.629409 4759 scope.go:117] "RemoveContainer" containerID="02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.629441 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2gdq6" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.669719 4759 scope.go:117] "RemoveContainer" containerID="ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.673375 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2gdq6"] Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.682723 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2gdq6"] Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.691788 4759 scope.go:117] "RemoveContainer" containerID="e0ff467482b114bce1f9e7ecfe61db52b7c06e9e177ea5f4784e9bc51af12766" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.754009 4759 scope.go:117] "RemoveContainer" containerID="02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab" Dec 05 02:20:45 crc kubenswrapper[4759]: E1205 02:20:45.754482 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab\": container with ID starting with 02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab not found: ID does not exist" containerID="02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.754511 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab"} err="failed to get container status \"02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab\": rpc error: code = NotFound desc = could not find container \"02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab\": container with ID starting with 02e783a7f3587c3ba353a6344ee422e095e3c58cfe34ff9abb48fd0f8e3e5dab not found: ID does not exist" Dec 05 02:20:45 crc 
kubenswrapper[4759]: I1205 02:20:45.754533 4759 scope.go:117] "RemoveContainer" containerID="ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429" Dec 05 02:20:45 crc kubenswrapper[4759]: E1205 02:20:45.754986 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429\": container with ID starting with ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429 not found: ID does not exist" containerID="ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.755004 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429"} err="failed to get container status \"ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429\": rpc error: code = NotFound desc = could not find container \"ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429\": container with ID starting with ae565498737ec0d8f494b79fef80514f50838e626def4933ecd06b39a0890429 not found: ID does not exist" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.755017 4759 scope.go:117] "RemoveContainer" containerID="e0ff467482b114bce1f9e7ecfe61db52b7c06e9e177ea5f4784e9bc51af12766" Dec 05 02:20:45 crc kubenswrapper[4759]: E1205 02:20:45.755421 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0ff467482b114bce1f9e7ecfe61db52b7c06e9e177ea5f4784e9bc51af12766\": container with ID starting with e0ff467482b114bce1f9e7ecfe61db52b7c06e9e177ea5f4784e9bc51af12766 not found: ID does not exist" containerID="e0ff467482b114bce1f9e7ecfe61db52b7c06e9e177ea5f4784e9bc51af12766" Dec 05 02:20:45 crc kubenswrapper[4759]: I1205 02:20:45.755440 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0ff467482b114bce1f9e7ecfe61db52b7c06e9e177ea5f4784e9bc51af12766"} err="failed to get container status \"e0ff467482b114bce1f9e7ecfe61db52b7c06e9e177ea5f4784e9bc51af12766\": rpc error: code = NotFound desc = could not find container \"e0ff467482b114bce1f9e7ecfe61db52b7c06e9e177ea5f4784e9bc51af12766\": container with ID starting with e0ff467482b114bce1f9e7ecfe61db52b7c06e9e177ea5f4784e9bc51af12766 not found: ID does not exist" Dec 05 02:20:47 crc kubenswrapper[4759]: I1205 02:20:47.171198 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c024b9c6-341f-4277-b051-7b988b17834e" path="/var/lib/kubelet/pods/c024b9c6-341f-4277-b051-7b988b17834e/volumes" Dec 05 02:20:49 crc kubenswrapper[4759]: I1205 02:20:49.156541 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854" Dec 05 02:20:49 crc kubenswrapper[4759]: E1205 02:20:49.157829 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:21:00 crc kubenswrapper[4759]: I1205 02:21:00.156700 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854" 
Dec 05 02:21:00 crc kubenswrapper[4759]: E1205 02:21:00.157651 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:21:11 crc kubenswrapper[4759]: I1205 02:21:11.171485 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:21:11 crc kubenswrapper[4759]: E1205 02:21:11.172324 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:21:22 crc kubenswrapper[4759]: I1205 02:21:22.157451 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:21:22 crc kubenswrapper[4759]: E1205 02:21:22.158897 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:21:35 crc kubenswrapper[4759]: I1205 02:21:35.157145 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:21:35 crc kubenswrapper[4759]: E1205 02:21:35.157917 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:21:46 crc kubenswrapper[4759]: I1205 02:21:46.155744 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:21:46 crc kubenswrapper[4759]: E1205 02:21:46.156511 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.484814 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rwxq6"]
Dec 05 02:21:52 crc kubenswrapper[4759]: E1205 02:21:52.485787 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c024b9c6-341f-4277-b051-7b988b17834e" containerName="extract-utilities"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.485800 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c024b9c6-341f-4277-b051-7b988b17834e" containerName="extract-utilities"
Dec 05 02:21:52 crc kubenswrapper[4759]: E1205 02:21:52.485813 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c024b9c6-341f-4277-b051-7b988b17834e" containerName="registry-server"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.485819 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c024b9c6-341f-4277-b051-7b988b17834e" containerName="registry-server"
Dec 05 02:21:52 crc kubenswrapper[4759]: E1205 02:21:52.485846 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c024b9c6-341f-4277-b051-7b988b17834e" containerName="extract-content"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.485854 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c024b9c6-341f-4277-b051-7b988b17834e" containerName="extract-content"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.486084 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c024b9c6-341f-4277-b051-7b988b17834e" containerName="registry-server"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.487671 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.497782 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rwxq6"]
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.641013 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-catalog-content\") pod \"certified-operators-rwxq6\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") " pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.641145 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-utilities\") pod \"certified-operators-rwxq6\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") " pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.641171 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq47z\" (UniqueName: \"kubernetes.io/projected/02aac88c-8c77-4e2d-9c97-f5fb190d640d-kube-api-access-rq47z\") pod \"certified-operators-rwxq6\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") " pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.743788 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-catalog-content\") pod \"certified-operators-rwxq6\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") " pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.743873 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-utilities\") pod \"certified-operators-rwxq6\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") " pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.743915 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq47z\" (UniqueName: \"kubernetes.io/projected/02aac88c-8c77-4e2d-9c97-f5fb190d640d-kube-api-access-rq47z\") pod \"certified-operators-rwxq6\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") " pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.744516 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-utilities\") pod \"certified-operators-rwxq6\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") " pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.744843 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-catalog-content\") pod \"certified-operators-rwxq6\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") " pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.770957 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq47z\" (UniqueName: \"kubernetes.io/projected/02aac88c-8c77-4e2d-9c97-f5fb190d640d-kube-api-access-rq47z\") pod \"certified-operators-rwxq6\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") " pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:21:52 crc kubenswrapper[4759]: I1205 02:21:52.839428 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:21:53 crc kubenswrapper[4759]: I1205 02:21:53.377412 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rwxq6"]
Dec 05 02:21:53 crc kubenswrapper[4759]: I1205 02:21:53.561068 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwxq6" event={"ID":"02aac88c-8c77-4e2d-9c97-f5fb190d640d","Type":"ContainerStarted","Data":"32ddf9a003a1c541390ba705011a0ef94145fa7c22e6c7b6791f6b256fc8943a"}
Dec 05 02:21:54 crc kubenswrapper[4759]: I1205 02:21:54.578248 4759 generic.go:334] "Generic (PLEG): container finished" podID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" containerID="51dadc52f357ea9bb39bae9db15fd2a67c9a79816036c0ae4da1144a8edefc40" exitCode=0
Dec 05 02:21:54 crc kubenswrapper[4759]: I1205 02:21:54.578349 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwxq6" event={"ID":"02aac88c-8c77-4e2d-9c97-f5fb190d640d","Type":"ContainerDied","Data":"51dadc52f357ea9bb39bae9db15fd2a67c9a79816036c0ae4da1144a8edefc40"}
Dec 05 02:21:55 crc kubenswrapper[4759]: I1205 02:21:55.593809 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwxq6" event={"ID":"02aac88c-8c77-4e2d-9c97-f5fb190d640d","Type":"ContainerStarted","Data":"83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6"}
Dec 05 02:21:57 crc kubenswrapper[4759]: I1205 02:21:57.621501 4759 generic.go:334] "Generic (PLEG): container finished" podID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" containerID="83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6" exitCode=0
Dec 05 02:21:57 crc kubenswrapper[4759]: I1205 02:21:57.621594 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwxq6" event={"ID":"02aac88c-8c77-4e2d-9c97-f5fb190d640d","Type":"ContainerDied","Data":"83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6"}
Dec 05 02:21:58 crc kubenswrapper[4759]: I1205 02:21:58.644232 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwxq6" event={"ID":"02aac88c-8c77-4e2d-9c97-f5fb190d640d","Type":"ContainerStarted","Data":"6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1"}
Dec 05 02:21:58 crc kubenswrapper[4759]: I1205 02:21:58.674921 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rwxq6" podStartSLOduration=3.084641089 podStartE2EDuration="6.674893088s" podCreationTimestamp="2025-12-05 02:21:52 +0000 UTC" firstStartedPulling="2025-12-05 02:21:54.582941147 +0000 UTC m=+7133.798602147" lastFinishedPulling="2025-12-05 02:21:58.173193156 +0000 UTC m=+7137.388854146" observedRunningTime="2025-12-05 02:21:58.669290062 +0000 UTC m=+7137.884951022" watchObservedRunningTime="2025-12-05 02:21:58.674893088 +0000 UTC m=+7137.890554048"
Dec 05 02:22:01 crc kubenswrapper[4759]: I1205 02:22:01.166195 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:22:01 crc kubenswrapper[4759]: E1205 02:22:01.166913 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:22:02 crc kubenswrapper[4759]: I1205 02:22:02.840275 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:22:02 crc kubenswrapper[4759]: I1205 02:22:02.840467 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:22:02 crc kubenswrapper[4759]: I1205 02:22:02.921991 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:22:03 crc kubenswrapper[4759]: I1205 02:22:03.803918 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:22:03 crc kubenswrapper[4759]: I1205 02:22:03.876363 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rwxq6"]
Dec 05 02:22:05 crc kubenswrapper[4759]: I1205 02:22:05.727782 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rwxq6" podUID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" containerName="registry-server" containerID="cri-o://6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1" gracePeriod=2
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.317616 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.357820 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq47z\" (UniqueName: \"kubernetes.io/projected/02aac88c-8c77-4e2d-9c97-f5fb190d640d-kube-api-access-rq47z\") pod \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") "
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.358966 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-utilities\") pod \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") "
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.359098 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-catalog-content\") pod \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\" (UID: \"02aac88c-8c77-4e2d-9c97-f5fb190d640d\") "
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.360535 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-utilities" (OuterVolumeSpecName: "utilities") pod "02aac88c-8c77-4e2d-9c97-f5fb190d640d" (UID: "02aac88c-8c77-4e2d-9c97-f5fb190d640d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.367222 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02aac88c-8c77-4e2d-9c97-f5fb190d640d-kube-api-access-rq47z" (OuterVolumeSpecName: "kube-api-access-rq47z") pod "02aac88c-8c77-4e2d-9c97-f5fb190d640d" (UID: "02aac88c-8c77-4e2d-9c97-f5fb190d640d"). InnerVolumeSpecName "kube-api-access-rq47z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.452602 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02aac88c-8c77-4e2d-9c97-f5fb190d640d" (UID: "02aac88c-8c77-4e2d-9c97-f5fb190d640d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.463542 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.463575 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02aac88c-8c77-4e2d-9c97-f5fb190d640d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.463586 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq47z\" (UniqueName: \"kubernetes.io/projected/02aac88c-8c77-4e2d-9c97-f5fb190d640d-kube-api-access-rq47z\") on node \"crc\" DevicePath \"\""
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.746548 4759 generic.go:334] "Generic (PLEG): container finished" podID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" containerID="6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1" exitCode=0
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.746598 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwxq6" event={"ID":"02aac88c-8c77-4e2d-9c97-f5fb190d640d","Type":"ContainerDied","Data":"6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1"}
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.746631 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwxq6" event={"ID":"02aac88c-8c77-4e2d-9c97-f5fb190d640d","Type":"ContainerDied","Data":"32ddf9a003a1c541390ba705011a0ef94145fa7c22e6c7b6791f6b256fc8943a"}
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.746651 4759 scope.go:117] "RemoveContainer" containerID="6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1"
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.746724 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rwxq6"
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.794830 4759 scope.go:117] "RemoveContainer" containerID="83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6"
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.799831 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rwxq6"]
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.810415 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rwxq6"]
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.817985 4759 scope.go:117] "RemoveContainer" containerID="51dadc52f357ea9bb39bae9db15fd2a67c9a79816036c0ae4da1144a8edefc40"
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.879145 4759 scope.go:117] "RemoveContainer" containerID="6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1"
Dec 05 02:22:06 crc kubenswrapper[4759]: E1205 02:22:06.879592 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1\": container with ID starting with 6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1 not found: ID does not exist" containerID="6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1"
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.879636 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1"} err="failed to get container status \"6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1\": rpc error: code = NotFound desc = could not find container \"6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1\": container with ID starting with 6a6a1d66c4c9c8bb45e515f5a5ce4c1913445e9487bf64d67a8e27ba97c4e3d1 not found: ID does not exist"
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.879664 4759 scope.go:117] "RemoveContainer" containerID="83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6"
Dec 05 02:22:06 crc kubenswrapper[4759]: E1205 02:22:06.879971 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6\": container with ID starting with 83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6 not found: ID does not exist" containerID="83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6"
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.880013 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6"} err="failed to get container status \"83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6\": rpc error: code = NotFound desc = could not find container \"83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6\": container with ID starting with 83517016ccb0529d249abe1d830a5dbdccdfb9edb43f1c79e5a351c629bab5d6 not found: ID does not exist"
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.880040 4759 scope.go:117] "RemoveContainer" containerID="51dadc52f357ea9bb39bae9db15fd2a67c9a79816036c0ae4da1144a8edefc40"
Dec 05 02:22:06 crc kubenswrapper[4759]: E1205 02:22:06.880525 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51dadc52f357ea9bb39bae9db15fd2a67c9a79816036c0ae4da1144a8edefc40\": container with ID starting with 51dadc52f357ea9bb39bae9db15fd2a67c9a79816036c0ae4da1144a8edefc40 not found: ID does not exist" containerID="51dadc52f357ea9bb39bae9db15fd2a67c9a79816036c0ae4da1144a8edefc40"
Dec 05 02:22:06 crc kubenswrapper[4759]: I1205 02:22:06.880557 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51dadc52f357ea9bb39bae9db15fd2a67c9a79816036c0ae4da1144a8edefc40"} err="failed to get container status \"51dadc52f357ea9bb39bae9db15fd2a67c9a79816036c0ae4da1144a8edefc40\": rpc error: code = NotFound desc = could not find container \"51dadc52f357ea9bb39bae9db15fd2a67c9a79816036c0ae4da1144a8edefc40\": container with ID starting with 51dadc52f357ea9bb39bae9db15fd2a67c9a79816036c0ae4da1144a8edefc40 not found: ID does not exist"
Dec 05 02:22:07 crc kubenswrapper[4759]: I1205 02:22:07.178891 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" path="/var/lib/kubelet/pods/02aac88c-8c77-4e2d-9c97-f5fb190d640d/volumes"
Dec 05 02:22:15 crc kubenswrapper[4759]: I1205 02:22:15.156378 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:22:15 crc kubenswrapper[4759]: E1205 02:22:15.157536 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:22:28 crc kubenswrapper[4759]: I1205 02:22:28.156271 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:22:28 crc kubenswrapper[4759]: E1205 02:22:28.162268 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:22:42 crc kubenswrapper[4759]: I1205 02:22:42.156470 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:22:42 crc kubenswrapper[4759]: E1205 02:22:42.157752 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:22:56 crc kubenswrapper[4759]: I1205 02:22:56.160198 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:22:56 crc kubenswrapper[4759]: E1205 02:22:56.161051 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:23:07 crc kubenswrapper[4759]: I1205 02:23:07.156358 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:23:07 crc kubenswrapper[4759]: E1205 02:23:07.157343 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:23:21 crc kubenswrapper[4759]: I1205 02:23:21.168575 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:23:21 crc kubenswrapper[4759]: E1205 02:23:21.171340 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:23:34 crc kubenswrapper[4759]: I1205 02:23:34.156883 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:23:34 crc kubenswrapper[4759]: E1205 02:23:34.157915 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:23:47 crc kubenswrapper[4759]: I1205 02:23:47.156664 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:23:47 crc kubenswrapper[4759]: E1205 02:23:47.157816 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:24:02 crc kubenswrapper[4759]: I1205 02:24:02.156881 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:24:02 crc kubenswrapper[4759]: E1205 02:24:02.158009 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:24:14 crc kubenswrapper[4759]: I1205 02:24:14.155957 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:24:14 crc kubenswrapper[4759]: E1205 02:24:14.156874 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:24:25 crc kubenswrapper[4759]: I1205 02:24:25.156398 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:24:25 crc kubenswrapper[4759]: E1205 02:24:25.157603 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.759689 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p47jw"]
Dec 05 02:24:34 crc kubenswrapper[4759]: E1205 02:24:34.760819 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" containerName="registry-server"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.760836 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" containerName="registry-server"
Dec 05 02:24:34 crc kubenswrapper[4759]: E1205 02:24:34.760863 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" containerName="extract-content"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.760871 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" containerName="extract-content"
Dec 05 02:24:34 crc kubenswrapper[4759]: E1205 02:24:34.760887 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" containerName="extract-utilities"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.760895 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" containerName="extract-utilities"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.761146 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="02aac88c-8c77-4e2d-9c97-f5fb190d640d" containerName="registry-server"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.763089 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.773970 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p47jw"]
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.791976 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5z4l\" (UniqueName: \"kubernetes.io/projected/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-kube-api-access-r5z4l\") pod \"community-operators-p47jw\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") " pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.792071 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-utilities\") pod \"community-operators-p47jw\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") " pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.792197 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-catalog-content\") pod \"community-operators-p47jw\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") " pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.894804 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-catalog-content\") pod \"community-operators-p47jw\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") " pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.894890 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5z4l\" (UniqueName: \"kubernetes.io/projected/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-kube-api-access-r5z4l\") pod \"community-operators-p47jw\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") " pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.895006 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-utilities\") pod \"community-operators-p47jw\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") " pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.895449 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-catalog-content\") pod \"community-operators-p47jw\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") " pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.895526 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-utilities\") pod \"community-operators-p47jw\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") " pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:34 crc kubenswrapper[4759]: I1205 02:24:34.914548 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5z4l\" (UniqueName: \"kubernetes.io/projected/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-kube-api-access-r5z4l\") pod \"community-operators-p47jw\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") " pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:35 crc kubenswrapper[4759]: I1205 02:24:35.091178 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:35 crc kubenswrapper[4759]: I1205 02:24:35.703519 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p47jw"]
Dec 05 02:24:35 crc kubenswrapper[4759]: I1205 02:24:35.921497 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p47jw" event={"ID":"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e","Type":"ContainerStarted","Data":"12543790d819d8309329ec290e3a9e2025744dfc8d946d2d052c905cb6a5065d"}
Dec 05 02:24:36 crc kubenswrapper[4759]: I1205 02:24:36.932799 4759 generic.go:334] "Generic (PLEG): container finished" podID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" containerID="c1672472cc5f59c41c5110768eb71208141929d311815d03d77c516f6c0c464e" exitCode=0
Dec 05 02:24:36 crc kubenswrapper[4759]: I1205 02:24:36.933083 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p47jw" event={"ID":"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e","Type":"ContainerDied","Data":"c1672472cc5f59c41c5110768eb71208141929d311815d03d77c516f6c0c464e"}
Dec 05 02:24:36 crc kubenswrapper[4759]: I1205 02:24:36.936093 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 05 02:24:37 crc kubenswrapper[4759]: I1205 02:24:37.946000 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p47jw" event={"ID":"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e","Type":"ContainerStarted","Data":"cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a"}
Dec 05 02:24:39 crc kubenswrapper[4759]: I1205 02:24:39.155868 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854"
Dec 05 02:24:39 crc kubenswrapper[4759]: E1205 02:24:39.156549 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6"
Dec 05 02:24:39 crc kubenswrapper[4759]: I1205 02:24:39.972516 4759 generic.go:334] "Generic (PLEG): container finished" podID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" containerID="cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a" exitCode=0
Dec 05 02:24:39 crc kubenswrapper[4759]: I1205 02:24:39.972560 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p47jw" event={"ID":"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e","Type":"ContainerDied","Data":"cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a"}
Dec 05 02:24:40 crc kubenswrapper[4759]: I1205 02:24:40.994680 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p47jw" event={"ID":"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e","Type":"ContainerStarted","Data":"a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1"}
Dec 05 02:24:41 crc kubenswrapper[4759]: I1205 02:24:41.023258 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p47jw" podStartSLOduration=3.513315085 podStartE2EDuration="7.023231994s" podCreationTimestamp="2025-12-05 02:24:34 +0000 UTC" firstStartedPulling="2025-12-05 02:24:36.935611839 +0000 UTC m=+7296.151272829" lastFinishedPulling="2025-12-05 02:24:40.445528788 +0000 UTC m=+7299.661189738" observedRunningTime="2025-12-05 02:24:41.015621889 +0000 UTC m=+7300.231282859" watchObservedRunningTime="2025-12-05 02:24:41.023231994 +0000 UTC m=+7300.238892954"
Dec 05 02:24:45 crc kubenswrapper[4759]: I1205 02:24:45.091466 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:45 crc kubenswrapper[4759]: I1205 02:24:45.092143 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:45 crc kubenswrapper[4759]: I1205 02:24:45.170699 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:46 crc kubenswrapper[4759]: I1205 02:24:46.108015 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:46 crc kubenswrapper[4759]: I1205 02:24:46.158529 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p47jw"]
Dec 05 02:24:48 crc kubenswrapper[4759]: I1205 02:24:48.075851 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p47jw" podUID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" containerName="registry-server" containerID="cri-o://a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1" gracePeriod=2
Dec 05 02:24:48 crc kubenswrapper[4759]: I1205 02:24:48.637151 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p47jw"
Dec 05 02:24:48 crc kubenswrapper[4759]: I1205 02:24:48.656781 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-utilities\") pod \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") "
Dec 05 02:24:48 crc kubenswrapper[4759]: I1205 02:24:48.656925 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5z4l\" (UniqueName: \"kubernetes.io/projected/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-kube-api-access-r5z4l\") pod \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") "
Dec 05 02:24:48 crc kubenswrapper[4759]: I1205 02:24:48.657644 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-utilities" (OuterVolumeSpecName: "utilities") pod "ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" (UID: "ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:24:48 crc kubenswrapper[4759]: I1205 02:24:48.657951 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-catalog-content\") pod \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\" (UID: \"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e\") " Dec 05 02:24:48 crc kubenswrapper[4759]: I1205 02:24:48.658651 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:24:48 crc kubenswrapper[4759]: I1205 02:24:48.664414 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-kube-api-access-r5z4l" (OuterVolumeSpecName: "kube-api-access-r5z4l") pod "ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" (UID: "ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e"). InnerVolumeSpecName "kube-api-access-r5z4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:24:48 crc kubenswrapper[4759]: I1205 02:24:48.724659 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" (UID: "ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:24:48 crc kubenswrapper[4759]: I1205 02:24:48.760908 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5z4l\" (UniqueName: \"kubernetes.io/projected/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-kube-api-access-r5z4l\") on node \"crc\" DevicePath \"\"" Dec 05 02:24:48 crc kubenswrapper[4759]: I1205 02:24:48.760938 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.092867 4759 generic.go:334] "Generic (PLEG): container finished" podID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" containerID="a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1" exitCode=0 Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.092933 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p47jw" Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.092954 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p47jw" event={"ID":"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e","Type":"ContainerDied","Data":"a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1"} Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.093524 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p47jw" event={"ID":"ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e","Type":"ContainerDied","Data":"12543790d819d8309329ec290e3a9e2025744dfc8d946d2d052c905cb6a5065d"} Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.093547 4759 scope.go:117] "RemoveContainer" containerID="a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1" Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.116423 4759 scope.go:117] "RemoveContainer" containerID="cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a" Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.139252 4759 scope.go:117] "RemoveContainer" containerID="c1672472cc5f59c41c5110768eb71208141929d311815d03d77c516f6c0c464e" Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.174097 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p47jw"] Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.174352 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p47jw"] Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.215167 4759 scope.go:117] "RemoveContainer" containerID="a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1" Dec 05 02:24:49 crc kubenswrapper[4759]: E1205 02:24:49.215804 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1\": container with ID starting with a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1 not found: ID does not exist" containerID="a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1" Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.215850 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1"} err="failed to get container status \"a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1\": rpc error: code = NotFound desc = could not find container \"a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1\": container with ID starting with a3282ea62ea3e5967ba1b8b14e5b35b8644e3490cf05d99a7f45ead3f3ef17c1 not found: ID does not exist" Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.215878 4759 scope.go:117] "RemoveContainer" containerID="cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a" Dec 05 02:24:49 crc kubenswrapper[4759]: E1205 02:24:49.216159 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a\": container with ID starting with cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a not found: ID does not exist" containerID="cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a" Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.216190 4759 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a"} err="failed to get container status \"cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a\": rpc error: code = NotFound desc = could not find container \"cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a\": container with ID starting with cbad522b19f7b95ac71a29229762083102684d5e7d67cea2efff41d02845320a not found: ID does not exist" Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.216210 4759 scope.go:117] "RemoveContainer" containerID="c1672472cc5f59c41c5110768eb71208141929d311815d03d77c516f6c0c464e" Dec 05 02:24:49 crc kubenswrapper[4759]: E1205 02:24:49.217331 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1672472cc5f59c41c5110768eb71208141929d311815d03d77c516f6c0c464e\": container with ID starting with c1672472cc5f59c41c5110768eb71208141929d311815d03d77c516f6c0c464e not found: ID does not exist" containerID="c1672472cc5f59c41c5110768eb71208141929d311815d03d77c516f6c0c464e" Dec 05 02:24:49 crc kubenswrapper[4759]: I1205 02:24:49.217363 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1672472cc5f59c41c5110768eb71208141929d311815d03d77c516f6c0c464e"} err="failed to get container status \"c1672472cc5f59c41c5110768eb71208141929d311815d03d77c516f6c0c464e\": rpc error: code = NotFound desc = could not find container \"c1672472cc5f59c41c5110768eb71208141929d311815d03d77c516f6c0c464e\": container with ID starting with c1672472cc5f59c41c5110768eb71208141929d311815d03d77c516f6c0c464e not found: ID does not exist" Dec 05 02:24:51 crc kubenswrapper[4759]: I1205 02:24:51.174595 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" path="/var/lib/kubelet/pods/ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e/volumes" Dec 05 02:24:52 crc kubenswrapper[4759]: I1205 02:24:52.157460 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854" Dec 05 02:24:52 crc kubenswrapper[4759]: E1205 02:24:52.158506 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:25:06 crc kubenswrapper[4759]: I1205 02:25:06.156465 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854" Dec 05 02:25:06 crc kubenswrapper[4759]: E1205 02:25:06.157268 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:25:19 crc kubenswrapper[4759]: I1205 02:25:19.156733 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854" Dec 05 02:25:19 crc 
kubenswrapper[4759]: E1205 02:25:19.159221 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:25:33 crc kubenswrapper[4759]: I1205 02:25:33.156812 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854" Dec 05 02:25:33 crc kubenswrapper[4759]: E1205 02:25:33.157671 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:25:45 crc kubenswrapper[4759]: I1205 02:25:45.155709 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854" Dec 05 02:25:45 crc kubenswrapper[4759]: I1205 02:25:45.733945 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"3ab5ccb875327560274e50ad18d4101cb298a901ea7a34a97c422033ec27e436"} Dec 05 02:26:14 crc kubenswrapper[4759]: I1205 02:26:14.083932 4759 generic.go:334] "Generic (PLEG): container finished" podID="703704f3-2e29-4eed-8943-3a34a004d8fc" containerID="3b21db198fa4b81f66dfa74870cce7ade64754b740837c0178dd2c4b0fdd390b" exitCode=0 Dec 05 02:26:14 crc kubenswrapper[4759]: I1205 02:26:14.084129 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"703704f3-2e29-4eed-8943-3a34a004d8fc","Type":"ContainerDied","Data":"3b21db198fa4b81f66dfa74870cce7ade64754b740837c0178dd2c4b0fdd390b"} Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.593612 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.652869 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-config-data\") pod \"703704f3-2e29-4eed-8943-3a34a004d8fc\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.652920 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config-secret\") pod \"703704f3-2e29-4eed-8943-3a34a004d8fc\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.653095 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-temporary\") pod \"703704f3-2e29-4eed-8943-3a34a004d8fc\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.653118 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ca-certs\") pod \"703704f3-2e29-4eed-8943-3a34a004d8fc\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.653142 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-workdir\") pod \"703704f3-2e29-4eed-8943-3a34a004d8fc\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.653255 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ssh-key\") pod \"703704f3-2e29-4eed-8943-3a34a004d8fc\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.653286 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"703704f3-2e29-4eed-8943-3a34a004d8fc\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.653340 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config\") pod \"703704f3-2e29-4eed-8943-3a34a004d8fc\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.653374 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqwx9\" (UniqueName: \"kubernetes.io/projected/703704f3-2e29-4eed-8943-3a34a004d8fc-kube-api-access-hqwx9\") pod \"703704f3-2e29-4eed-8943-3a34a004d8fc\" (UID: \"703704f3-2e29-4eed-8943-3a34a004d8fc\") " Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.653985 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "703704f3-2e29-4eed-8943-3a34a004d8fc" (UID: "703704f3-2e29-4eed-8943-3a34a004d8fc"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.654228 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-config-data" (OuterVolumeSpecName: "config-data") pod "703704f3-2e29-4eed-8943-3a34a004d8fc" (UID: "703704f3-2e29-4eed-8943-3a34a004d8fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.662014 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "703704f3-2e29-4eed-8943-3a34a004d8fc" (UID: "703704f3-2e29-4eed-8943-3a34a004d8fc"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.674736 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703704f3-2e29-4eed-8943-3a34a004d8fc-kube-api-access-hqwx9" (OuterVolumeSpecName: "kube-api-access-hqwx9") pod "703704f3-2e29-4eed-8943-3a34a004d8fc" (UID: "703704f3-2e29-4eed-8943-3a34a004d8fc"). InnerVolumeSpecName "kube-api-access-hqwx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.687170 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "703704f3-2e29-4eed-8943-3a34a004d8fc" (UID: "703704f3-2e29-4eed-8943-3a34a004d8fc"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.699470 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "703704f3-2e29-4eed-8943-3a34a004d8fc" (UID: "703704f3-2e29-4eed-8943-3a34a004d8fc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.710150 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "703704f3-2e29-4eed-8943-3a34a004d8fc" (UID: "703704f3-2e29-4eed-8943-3a34a004d8fc"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.710279 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "703704f3-2e29-4eed-8943-3a34a004d8fc" (UID: "703704f3-2e29-4eed-8943-3a34a004d8fc"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.738536 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "703704f3-2e29-4eed-8943-3a34a004d8fc" (UID: "703704f3-2e29-4eed-8943-3a34a004d8fc"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.756784 4759 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.757093 4759 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.757114 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.757137 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqwx9\" (UniqueName: \"kubernetes.io/projected/703704f3-2e29-4eed-8943-3a34a004d8fc-kube-api-access-hqwx9\") on node \"crc\" DevicePath \"\"" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.757148 4759 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/703704f3-2e29-4eed-8943-3a34a004d8fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.757156 4759 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.757177 4759 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.757185 4759 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/703704f3-2e29-4eed-8943-3a34a004d8fc-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.757194 4759 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/703704f3-2e29-4eed-8943-3a34a004d8fc-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.797402 4759 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 02:26:15 crc kubenswrapper[4759]: I1205 02:26:15.859178 4759 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 02:26:16 crc kubenswrapper[4759]: I1205 02:26:16.108754 4759 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"703704f3-2e29-4eed-8943-3a34a004d8fc","Type":"ContainerDied","Data":"74d240ae37d1d13fe81f8ad63f62dd76aedd4e64c1aab300c00e38fc2954dcc0"} Dec 05 02:26:16 crc kubenswrapper[4759]: I1205 02:26:16.109007 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74d240ae37d1d13fe81f8ad63f62dd76aedd4e64c1aab300c00e38fc2954dcc0" Dec 05 02:26:16 crc kubenswrapper[4759]: I1205 02:26:16.108872 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.490360 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 02:26:27 crc kubenswrapper[4759]: E1205 02:26:27.492735 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703704f3-2e29-4eed-8943-3a34a004d8fc" containerName="tempest-tests-tempest-tests-runner" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.492851 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="703704f3-2e29-4eed-8943-3a34a004d8fc" containerName="tempest-tests-tempest-tests-runner" Dec 05 02:26:27 crc kubenswrapper[4759]: E1205 02:26:27.492944 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" containerName="extract-content" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.493024 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" containerName="extract-content" Dec 05 02:26:27 crc kubenswrapper[4759]: E1205 02:26:27.493115 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" containerName="registry-server" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.493189 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" containerName="registry-server" Dec 05 02:26:27 crc kubenswrapper[4759]: E1205 02:26:27.493275 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" containerName="extract-utilities" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.493412 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" containerName="extract-utilities" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.493774 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba1fbbde-43c1-4f5f-8f9b-fba059d9a90e" containerName="registry-server" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.493899 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="703704f3-2e29-4eed-8943-3a34a004d8fc" containerName="tempest-tests-tempest-tests-runner" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.494979 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.497685 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-m5tbt" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.501884 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.581730 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"98168e43-7dcd-4145-b010-078c0d190596\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.581772 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rngl4\" (UniqueName: \"kubernetes.io/projected/98168e43-7dcd-4145-b010-078c0d190596-kube-api-access-rngl4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"98168e43-7dcd-4145-b010-078c0d190596\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.683999 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"98168e43-7dcd-4145-b010-078c0d190596\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.684060 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rngl4\" (UniqueName: \"kubernetes.io/projected/98168e43-7dcd-4145-b010-078c0d190596-kube-api-access-rngl4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"98168e43-7dcd-4145-b010-078c0d190596\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.684948 4759 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"98168e43-7dcd-4145-b010-078c0d190596\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.705597 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rngl4\" (UniqueName: \"kubernetes.io/projected/98168e43-7dcd-4145-b010-078c0d190596-kube-api-access-rngl4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"98168e43-7dcd-4145-b010-078c0d190596\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 02:26:27 crc kubenswrapper[4759]: I1205 02:26:27.718735 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"98168e43-7dcd-4145-b010-078c0d190596\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 02:26:27 crc 
kubenswrapper[4759]: I1205 02:26:27.831906 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 02:26:28 crc kubenswrapper[4759]: W1205 02:26:28.392083 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98168e43_7dcd_4145_b010_078c0d190596.slice/crio-ce9c438b6f59294a93a7a33e40b494713e1862f8b84c1bb1302707a95b899c2a WatchSource:0}: Error finding container ce9c438b6f59294a93a7a33e40b494713e1862f8b84c1bb1302707a95b899c2a: Status 404 returned error can't find the container with id ce9c438b6f59294a93a7a33e40b494713e1862f8b84c1bb1302707a95b899c2a Dec 05 02:26:28 crc kubenswrapper[4759]: I1205 02:26:28.416066 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 02:26:29 crc kubenswrapper[4759]: I1205 02:26:29.281541 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"98168e43-7dcd-4145-b010-078c0d190596","Type":"ContainerStarted","Data":"ce9c438b6f59294a93a7a33e40b494713e1862f8b84c1bb1302707a95b899c2a"} Dec 05 02:26:30 crc kubenswrapper[4759]: I1205 02:26:30.295272 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"98168e43-7dcd-4145-b010-078c0d190596","Type":"ContainerStarted","Data":"e32c3d68cb7643a2d9a75340114c3e3d9a7d72ea46d3fb2f18a1795b5b5f9d6f"} Dec 05 02:26:30 crc kubenswrapper[4759]: I1205 02:26:30.319155 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.219538674 podStartE2EDuration="3.319100339s" podCreationTimestamp="2025-12-05 02:26:27 +0000 UTC" firstStartedPulling="2025-12-05 02:26:28.40158701 +0000 UTC m=+7407.617247960" lastFinishedPulling="2025-12-05 02:26:29.501148665 +0000 UTC m=+7408.716809625" observedRunningTime="2025-12-05 02:26:30.308253106 +0000 UTC m=+7409.523914076" watchObservedRunningTime="2025-12-05 02:26:30.319100339 +0000 UTC m=+7409.534761279" Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.185772 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dc8qp/must-gather-2kv9f"] Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.188519 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc8qp/must-gather-2kv9f" Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.194785 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dc8qp"/"kube-root-ca.crt" Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.194844 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dc8qp"/"default-dockercfg-rc7p4" Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.194788 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dc8qp"/"openshift-service-ca.crt" Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.210525 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dc8qp/must-gather-2kv9f"] Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.354733 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5rlb\" (UniqueName: \"kubernetes.io/projected/16c9ee17-7873-4479-814f-7233e5be32c9-kube-api-access-s5rlb\") pod \"must-gather-2kv9f\" (UID: \"16c9ee17-7873-4479-814f-7233e5be32c9\") " pod="openshift-must-gather-dc8qp/must-gather-2kv9f" Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.355187 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/16c9ee17-7873-4479-814f-7233e5be32c9-must-gather-output\") pod \"must-gather-2kv9f\" (UID: \"16c9ee17-7873-4479-814f-7233e5be32c9\") " pod="openshift-must-gather-dc8qp/must-gather-2kv9f" Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.457215 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5rlb\" (UniqueName: \"kubernetes.io/projected/16c9ee17-7873-4479-814f-7233e5be32c9-kube-api-access-s5rlb\") pod \"must-gather-2kv9f\" (UID: \"16c9ee17-7873-4479-814f-7233e5be32c9\") " pod="openshift-must-gather-dc8qp/must-gather-2kv9f" Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.457467 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/16c9ee17-7873-4479-814f-7233e5be32c9-must-gather-output\") pod \"must-gather-2kv9f\" (UID: \"16c9ee17-7873-4479-814f-7233e5be32c9\") " pod="openshift-must-gather-dc8qp/must-gather-2kv9f" Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.457925 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/16c9ee17-7873-4479-814f-7233e5be32c9-must-gather-output\") pod \"must-gather-2kv9f\" (UID: \"16c9ee17-7873-4479-814f-7233e5be32c9\") " pod="openshift-must-gather-dc8qp/must-gather-2kv9f" Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.484895 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5rlb\" (UniqueName: \"kubernetes.io/projected/16c9ee17-7873-4479-814f-7233e5be32c9-kube-api-access-s5rlb\") pod \"must-gather-2kv9f\" (UID: \"16c9ee17-7873-4479-814f-7233e5be32c9\") " pod="openshift-must-gather-dc8qp/must-gather-2kv9f" Dec 05 02:27:01 crc kubenswrapper[4759]: I1205 02:27:01.508341 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc8qp/must-gather-2kv9f" Dec 05 02:27:02 crc kubenswrapper[4759]: I1205 02:27:02.012261 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dc8qp/must-gather-2kv9f"] Dec 05 02:27:02 crc kubenswrapper[4759]: I1205 02:27:02.728738 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/must-gather-2kv9f" event={"ID":"16c9ee17-7873-4479-814f-7233e5be32c9","Type":"ContainerStarted","Data":"6c5bb08724042ff4bf3c56c44de80eb8d4ecfe045959c522edb5e39a40a0891f"} Dec 05 02:27:07 crc kubenswrapper[4759]: I1205 02:27:07.782144 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/must-gather-2kv9f" event={"ID":"16c9ee17-7873-4479-814f-7233e5be32c9","Type":"ContainerStarted","Data":"233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171"} Dec 05 02:27:07 crc kubenswrapper[4759]: I1205 02:27:07.782784 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/must-gather-2kv9f" event={"ID":"16c9ee17-7873-4479-814f-7233e5be32c9","Type":"ContainerStarted","Data":"237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196"} Dec 05 02:27:07 crc kubenswrapper[4759]: I1205 02:27:07.827757 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dc8qp/must-gather-2kv9f" podStartSLOduration=1.867857442 podStartE2EDuration="6.827729978s" podCreationTimestamp="2025-12-05 02:27:01 +0000 UTC" firstStartedPulling="2025-12-05 02:27:02.020147706 +0000 UTC m=+7441.235808656" lastFinishedPulling="2025-12-05 02:27:06.980020242 +0000 UTC m=+7446.195681192" observedRunningTime="2025-12-05 02:27:07.802129898 +0000 UTC m=+7447.017790888" watchObservedRunningTime="2025-12-05 02:27:07.827729978 +0000 UTC m=+7447.043390938" Dec 05 02:27:11 crc kubenswrapper[4759]: E1205 02:27:11.083845 4759 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.150:42262->38.102.83.150:42699: read tcp 38.102.83.150:42262->38.102.83.150:42699: read: connection reset by peer Dec 05 02:27:12 crc kubenswrapper[4759]: I1205 02:27:12.474418 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dc8qp/crc-debug-pdxsq"] Dec 05 02:27:12 crc kubenswrapper[4759]: I1205 02:27:12.476526 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" Dec 05 02:27:12 crc kubenswrapper[4759]: I1205 02:27:12.550450 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52zmb\" (UniqueName: \"kubernetes.io/projected/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-kube-api-access-52zmb\") pod \"crc-debug-pdxsq\" (UID: \"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51\") " pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" Dec 05 02:27:12 crc kubenswrapper[4759]: I1205 02:27:12.550497 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-host\") pod \"crc-debug-pdxsq\" (UID: \"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51\") " pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" Dec 05 02:27:12 crc kubenswrapper[4759]: I1205 02:27:12.652844 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52zmb\" (UniqueName: \"kubernetes.io/projected/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-kube-api-access-52zmb\") pod \"crc-debug-pdxsq\" (UID: \"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51\") " pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" Dec 05 02:27:12 crc kubenswrapper[4759]: I1205 02:27:12.652902 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-host\") pod \"crc-debug-pdxsq\" (UID: \"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51\") " pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" Dec 05 02:27:12 crc kubenswrapper[4759]: I1205 02:27:12.653326 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-host\") pod \"crc-debug-pdxsq\" (UID: \"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51\") " pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" Dec 05 02:27:12 crc kubenswrapper[4759]: I1205 02:27:12.693195 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52zmb\" (UniqueName: \"kubernetes.io/projected/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-kube-api-access-52zmb\") pod \"crc-debug-pdxsq\" (UID: \"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51\") " pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" Dec 05 02:27:12 crc kubenswrapper[4759]: I1205 02:27:12.794970 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" Dec 05 02:27:12 crc kubenswrapper[4759]: W1205 02:27:12.866675 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ee1e976_50a0_4b7e_87e2_ea56e2c11f51.slice/crio-debbacebfbd1d41f2ba1d19584908903b7443a475dd09c99667b292fd964adef WatchSource:0}: Error finding container debbacebfbd1d41f2ba1d19584908903b7443a475dd09c99667b292fd964adef: Status 404 returned error can't find the container with id debbacebfbd1d41f2ba1d19584908903b7443a475dd09c99667b292fd964adef Dec 05 02:27:13 crc kubenswrapper[4759]: I1205 02:27:13.866182 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" event={"ID":"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51","Type":"ContainerStarted","Data":"debbacebfbd1d41f2ba1d19584908903b7443a475dd09c99667b292fd964adef"} Dec 05 02:27:26 crc kubenswrapper[4759]: I1205 02:27:26.008328 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" event={"ID":"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51","Type":"ContainerStarted","Data":"374511b285a8ba1e5ecbe23b509c50843db296571f71ec5fa982be9dba719803"} Dec 05 02:27:26 crc kubenswrapper[4759]: I1205 02:27:26.030389 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" podStartSLOduration=1.6181684939999998 podStartE2EDuration="14.030288641s" podCreationTimestamp="2025-12-05 02:27:12 +0000 UTC" firstStartedPulling="2025-12-05 02:27:12.873433867 +0000 UTC m=+7452.089094827" lastFinishedPulling="2025-12-05 02:27:25.285554024 +0000 UTC m=+7464.501214974" observedRunningTime="2025-12-05 02:27:26.022520544 +0000 UTC m=+7465.238181494" watchObservedRunningTime="2025-12-05 02:27:26.030288641 +0000 UTC m=+7465.245949591" Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.730149 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4l7sr"] Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.735852 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.751275 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4l7sr"] Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.852092 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnqr9\" (UniqueName: \"kubernetes.io/projected/5364bc55-17e5-479d-b75b-e6fce50d2b38-kube-api-access-vnqr9\") pod \"redhat-marketplace-4l7sr\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.852201 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-utilities\") pod \"redhat-marketplace-4l7sr\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.852302 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-catalog-content\") pod \"redhat-marketplace-4l7sr\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.954318 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnqr9\" (UniqueName: \"kubernetes.io/projected/5364bc55-17e5-479d-b75b-e6fce50d2b38-kube-api-access-vnqr9\") pod \"redhat-marketplace-4l7sr\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.954405 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-utilities\") pod \"redhat-marketplace-4l7sr\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.954478 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-catalog-content\") pod \"redhat-marketplace-4l7sr\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.955078 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-catalog-content\") pod \"redhat-marketplace-4l7sr\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.955206 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-utilities\") pod \"redhat-marketplace-4l7sr\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:02 crc kubenswrapper[4759]: I1205 02:28:02.976532 4759 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vnqr9\" (UniqueName: \"kubernetes.io/projected/5364bc55-17e5-479d-b75b-e6fce50d2b38-kube-api-access-vnqr9\") pod \"redhat-marketplace-4l7sr\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:03 crc kubenswrapper[4759]: I1205 02:28:03.073363 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:03 crc kubenswrapper[4759]: I1205 02:28:03.784841 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4l7sr"] Dec 05 02:28:04 crc kubenswrapper[4759]: I1205 02:28:04.432936 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:28:04 crc kubenswrapper[4759]: I1205 02:28:04.433583 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:28:04 crc kubenswrapper[4759]: I1205 02:28:04.434932 4759 generic.go:334] "Generic (PLEG): container finished" podID="5364bc55-17e5-479d-b75b-e6fce50d2b38" containerID="4001b879ee11f206a7c22615bcad3d80cc053c9884f1c917fcdbb1569506a37e" exitCode=0 Dec 05 02:28:04 crc kubenswrapper[4759]: I1205 02:28:04.435076 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l7sr" event={"ID":"5364bc55-17e5-479d-b75b-e6fce50d2b38","Type":"ContainerDied","Data":"4001b879ee11f206a7c22615bcad3d80cc053c9884f1c917fcdbb1569506a37e"} Dec 05 02:28:04 crc kubenswrapper[4759]: I1205 02:28:04.435865 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l7sr" event={"ID":"5364bc55-17e5-479d-b75b-e6fce50d2b38","Type":"ContainerStarted","Data":"b5cf438800c4b23457b65301462ad205d420a248976f66da96b0cea954ca442c"} Dec 05 02:28:05 crc kubenswrapper[4759]: I1205 02:28:05.447895 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l7sr" event={"ID":"5364bc55-17e5-479d-b75b-e6fce50d2b38","Type":"ContainerStarted","Data":"6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6"} Dec 05 02:28:06 crc kubenswrapper[4759]: I1205 02:28:06.460225 4759 generic.go:334] "Generic (PLEG): container finished" podID="5364bc55-17e5-479d-b75b-e6fce50d2b38" containerID="6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6" exitCode=0 Dec 05 02:28:06 crc kubenswrapper[4759]: I1205 02:28:06.460351 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l7sr" event={"ID":"5364bc55-17e5-479d-b75b-e6fce50d2b38","Type":"ContainerDied","Data":"6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6"} Dec 05 02:28:07 crc kubenswrapper[4759]: I1205 02:28:07.512086 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4l7sr" podStartSLOduration=2.726964622 podStartE2EDuration="5.512068069s" podCreationTimestamp="2025-12-05 02:28:02 +0000 UTC" firstStartedPulling="2025-12-05 02:28:04.437377157 
+0000 UTC m=+7503.653038107" lastFinishedPulling="2025-12-05 02:28:07.222480604 +0000 UTC m=+7506.438141554" observedRunningTime="2025-12-05 02:28:07.502474796 +0000 UTC m=+7506.718135746" watchObservedRunningTime="2025-12-05 02:28:07.512068069 +0000 UTC m=+7506.727729019" Dec 05 02:28:08 crc kubenswrapper[4759]: I1205 02:28:08.495214 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l7sr" event={"ID":"5364bc55-17e5-479d-b75b-e6fce50d2b38","Type":"ContainerStarted","Data":"ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6"} Dec 05 02:28:13 crc kubenswrapper[4759]: I1205 02:28:13.074037 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:13 crc kubenswrapper[4759]: I1205 02:28:13.074577 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:13 crc kubenswrapper[4759]: I1205 02:28:13.128693 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:13 crc kubenswrapper[4759]: I1205 02:28:13.610003 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:13 crc kubenswrapper[4759]: I1205 02:28:13.661593 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4l7sr"] Dec 05 02:28:15 crc kubenswrapper[4759]: I1205 02:28:15.573450 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4l7sr" podUID="5364bc55-17e5-479d-b75b-e6fce50d2b38" containerName="registry-server" containerID="cri-o://ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6" gracePeriod=2 Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.158555 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.260825 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-utilities\") pod \"5364bc55-17e5-479d-b75b-e6fce50d2b38\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.261321 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-catalog-content\") pod \"5364bc55-17e5-479d-b75b-e6fce50d2b38\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.261382 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnqr9\" (UniqueName: \"kubernetes.io/projected/5364bc55-17e5-479d-b75b-e6fce50d2b38-kube-api-access-vnqr9\") pod \"5364bc55-17e5-479d-b75b-e6fce50d2b38\" (UID: \"5364bc55-17e5-479d-b75b-e6fce50d2b38\") " Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.263661 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-utilities" (OuterVolumeSpecName: "utilities") pod "5364bc55-17e5-479d-b75b-e6fce50d2b38" (UID: "5364bc55-17e5-479d-b75b-e6fce50d2b38"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.268876 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5364bc55-17e5-479d-b75b-e6fce50d2b38-kube-api-access-vnqr9" (OuterVolumeSpecName: "kube-api-access-vnqr9") pod "5364bc55-17e5-479d-b75b-e6fce50d2b38" (UID: "5364bc55-17e5-479d-b75b-e6fce50d2b38"). InnerVolumeSpecName "kube-api-access-vnqr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.280296 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5364bc55-17e5-479d-b75b-e6fce50d2b38" (UID: "5364bc55-17e5-479d-b75b-e6fce50d2b38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.364928 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnqr9\" (UniqueName: \"kubernetes.io/projected/5364bc55-17e5-479d-b75b-e6fce50d2b38-kube-api-access-vnqr9\") on node \"crc\" DevicePath \"\"" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.364980 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.364992 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5364bc55-17e5-479d-b75b-e6fce50d2b38-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.589964 4759 generic.go:334] "Generic (PLEG): container finished" podID="9ee1e976-50a0-4b7e-87e2-ea56e2c11f51" containerID="374511b285a8ba1e5ecbe23b509c50843db296571f71ec5fa982be9dba719803" exitCode=0 Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.590297 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" event={"ID":"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51","Type":"ContainerDied","Data":"374511b285a8ba1e5ecbe23b509c50843db296571f71ec5fa982be9dba719803"} Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.594609 4759 generic.go:334] "Generic (PLEG): container finished" podID="5364bc55-17e5-479d-b75b-e6fce50d2b38" containerID="ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6" exitCode=0 Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.594670 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l7sr" event={"ID":"5364bc55-17e5-479d-b75b-e6fce50d2b38","Type":"ContainerDied","Data":"ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6"} Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.594718 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4l7sr" event={"ID":"5364bc55-17e5-479d-b75b-e6fce50d2b38","Type":"ContainerDied","Data":"b5cf438800c4b23457b65301462ad205d420a248976f66da96b0cea954ca442c"} Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.594739 4759 scope.go:117] "RemoveContainer" containerID="ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.595593 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4l7sr" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.627763 4759 scope.go:117] "RemoveContainer" containerID="6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.656358 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4l7sr"] Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.677738 4759 scope.go:117] "RemoveContainer" containerID="4001b879ee11f206a7c22615bcad3d80cc053c9884f1c917fcdbb1569506a37e" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.678246 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4l7sr"] Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.713682 4759 scope.go:117] "RemoveContainer" containerID="ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6" Dec 05 02:28:16 crc kubenswrapper[4759]: E1205 02:28:16.714257 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6\": container with ID starting with ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6 not found: ID does not exist" containerID="ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.714389 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6"} err="failed to get container status \"ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6\": rpc error: code = NotFound desc = could not find container \"ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6\": container with ID starting with ceba81f8b756a3268f536ef52e4599dcc365209c608c089816b3af781f20bbe6 not found: ID does not exist" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.714491 4759 scope.go:117] "RemoveContainer" containerID="6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6" Dec 05 02:28:16 crc kubenswrapper[4759]: E1205 02:28:16.715060 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6\": container with ID starting with 6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6 not found: ID does not exist" containerID="6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.715102 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6"} err="failed to get container status \"6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6\": rpc error: code = NotFound desc = could not find container \"6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6\": container with ID starting with 6554a2bad35a55b16c0cc1e5cbe14a062078013711e3fc8342d2d8b91c45eea6 not found: ID does not exist" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.715130 4759 scope.go:117] "RemoveContainer" containerID="4001b879ee11f206a7c22615bcad3d80cc053c9884f1c917fcdbb1569506a37e" Dec 05 02:28:16 crc kubenswrapper[4759]: E1205 02:28:16.715577 4759 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4001b879ee11f206a7c22615bcad3d80cc053c9884f1c917fcdbb1569506a37e\": container with ID starting with 4001b879ee11f206a7c22615bcad3d80cc053c9884f1c917fcdbb1569506a37e not found: ID does not exist" containerID="4001b879ee11f206a7c22615bcad3d80cc053c9884f1c917fcdbb1569506a37e" Dec 05 02:28:16 crc kubenswrapper[4759]: I1205 02:28:16.715611 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4001b879ee11f206a7c22615bcad3d80cc053c9884f1c917fcdbb1569506a37e"} err="failed to get container status \"4001b879ee11f206a7c22615bcad3d80cc053c9884f1c917fcdbb1569506a37e\": rpc error: code = NotFound desc = could not find container \"4001b879ee11f206a7c22615bcad3d80cc053c9884f1c917fcdbb1569506a37e\": container with ID starting with 4001b879ee11f206a7c22615bcad3d80cc053c9884f1c917fcdbb1569506a37e not found: ID does not exist" Dec 05 02:28:17 crc kubenswrapper[4759]: I1205 02:28:17.169691 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5364bc55-17e5-479d-b75b-e6fce50d2b38" path="/var/lib/kubelet/pods/5364bc55-17e5-479d-b75b-e6fce50d2b38/volumes" Dec 05 02:28:17 crc kubenswrapper[4759]: I1205 02:28:17.744017 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" Dec 05 02:28:17 crc kubenswrapper[4759]: I1205 02:28:17.794913 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dc8qp/crc-debug-pdxsq"] Dec 05 02:28:17 crc kubenswrapper[4759]: I1205 02:28:17.802642 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52zmb\" (UniqueName: \"kubernetes.io/projected/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-kube-api-access-52zmb\") pod \"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51\" (UID: \"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51\") " Dec 05 02:28:17 crc kubenswrapper[4759]: I1205 02:28:17.802720 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-host\") pod \"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51\" (UID: \"9ee1e976-50a0-4b7e-87e2-ea56e2c11f51\") " Dec 05 02:28:17 crc kubenswrapper[4759]: I1205 02:28:17.804268 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-host" (OuterVolumeSpecName: "host") pod "9ee1e976-50a0-4b7e-87e2-ea56e2c11f51" (UID: "9ee1e976-50a0-4b7e-87e2-ea56e2c11f51"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 02:28:17 crc kubenswrapper[4759]: I1205 02:28:17.807025 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dc8qp/crc-debug-pdxsq"] Dec 05 02:28:17 crc kubenswrapper[4759]: I1205 02:28:17.811653 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-kube-api-access-52zmb" (OuterVolumeSpecName: "kube-api-access-52zmb") pod "9ee1e976-50a0-4b7e-87e2-ea56e2c11f51" (UID: "9ee1e976-50a0-4b7e-87e2-ea56e2c11f51"). InnerVolumeSpecName "kube-api-access-52zmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:28:17 crc kubenswrapper[4759]: I1205 02:28:17.906169 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52zmb\" (UniqueName: \"kubernetes.io/projected/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-kube-api-access-52zmb\") on node \"crc\" DevicePath \"\"" Dec 05 02:28:17 crc kubenswrapper[4759]: I1205 02:28:17.906235 4759 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51-host\") on node \"crc\" DevicePath \"\"" Dec 05 02:28:18 crc kubenswrapper[4759]: I1205 02:28:18.617814 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="debbacebfbd1d41f2ba1d19584908903b7443a475dd09c99667b292fd964adef" Dec 05 02:28:18 crc kubenswrapper[4759]: I1205 02:28:18.617870 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-pdxsq" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.013280 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dc8qp/crc-debug-7qh92"] Dec 05 02:28:19 crc kubenswrapper[4759]: E1205 02:28:19.013878 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5364bc55-17e5-479d-b75b-e6fce50d2b38" containerName="extract-content" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.013897 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5364bc55-17e5-479d-b75b-e6fce50d2b38" containerName="extract-content" Dec 05 02:28:19 crc kubenswrapper[4759]: E1205 02:28:19.013958 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee1e976-50a0-4b7e-87e2-ea56e2c11f51" containerName="container-00" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.013971 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee1e976-50a0-4b7e-87e2-ea56e2c11f51" containerName="container-00" Dec 05 02:28:19 crc kubenswrapper[4759]: E1205 02:28:19.013989 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5364bc55-17e5-479d-b75b-e6fce50d2b38" containerName="registry-server" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.013999 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5364bc55-17e5-479d-b75b-e6fce50d2b38" containerName="registry-server" Dec 05 02:28:19 crc kubenswrapper[4759]: E1205 02:28:19.014011 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5364bc55-17e5-479d-b75b-e6fce50d2b38" containerName="extract-utilities" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.014020 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5364bc55-17e5-479d-b75b-e6fce50d2b38" containerName="extract-utilities" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.014279 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee1e976-50a0-4b7e-87e2-ea56e2c11f51" containerName="container-00" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.014332 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5364bc55-17e5-479d-b75b-e6fce50d2b38" containerName="registry-server" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.015477 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-7qh92" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.133985 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsgs7\" (UniqueName: \"kubernetes.io/projected/1df391fa-af95-437f-9257-46b3f3650b22-kube-api-access-dsgs7\") pod \"crc-debug-7qh92\" (UID: \"1df391fa-af95-437f-9257-46b3f3650b22\") " pod="openshift-must-gather-dc8qp/crc-debug-7qh92" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.134424 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1df391fa-af95-437f-9257-46b3f3650b22-host\") pod \"crc-debug-7qh92\" (UID: \"1df391fa-af95-437f-9257-46b3f3650b22\") " pod="openshift-must-gather-dc8qp/crc-debug-7qh92" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.171728 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee1e976-50a0-4b7e-87e2-ea56e2c11f51" path="/var/lib/kubelet/pods/9ee1e976-50a0-4b7e-87e2-ea56e2c11f51/volumes" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.237719 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsgs7\" (UniqueName: \"kubernetes.io/projected/1df391fa-af95-437f-9257-46b3f3650b22-kube-api-access-dsgs7\") pod \"crc-debug-7qh92\" (UID: \"1df391fa-af95-437f-9257-46b3f3650b22\") " pod="openshift-must-gather-dc8qp/crc-debug-7qh92" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.237965 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1df391fa-af95-437f-9257-46b3f3650b22-host\") pod \"crc-debug-7qh92\" (UID: \"1df391fa-af95-437f-9257-46b3f3650b22\") " pod="openshift-must-gather-dc8qp/crc-debug-7qh92" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.238149 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1df391fa-af95-437f-9257-46b3f3650b22-host\") pod \"crc-debug-7qh92\" (UID: \"1df391fa-af95-437f-9257-46b3f3650b22\") " pod="openshift-must-gather-dc8qp/crc-debug-7qh92" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.293794 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsgs7\" (UniqueName: \"kubernetes.io/projected/1df391fa-af95-437f-9257-46b3f3650b22-kube-api-access-dsgs7\") pod \"crc-debug-7qh92\" (UID: \"1df391fa-af95-437f-9257-46b3f3650b22\") " pod="openshift-must-gather-dc8qp/crc-debug-7qh92" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.351328 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-7qh92" Dec 05 02:28:19 crc kubenswrapper[4759]: I1205 02:28:19.629383 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/crc-debug-7qh92" event={"ID":"1df391fa-af95-437f-9257-46b3f3650b22","Type":"ContainerStarted","Data":"ff2329ba6a7866ab10d5a97f7af07c3636a5a757af2970a8aa1d0c5a3be6b810"} Dec 05 02:28:20 crc kubenswrapper[4759]: I1205 02:28:20.642292 4759 generic.go:334] "Generic (PLEG): container finished" podID="1df391fa-af95-437f-9257-46b3f3650b22" containerID="0f6e91f02b08a51a366d8f5326b3a7a8731af2887f01a4ebe7ba161db5ae6e3f" exitCode=0 Dec 05 02:28:20 crc kubenswrapper[4759]: I1205 02:28:20.642356 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/crc-debug-7qh92" event={"ID":"1df391fa-af95-437f-9257-46b3f3650b22","Type":"ContainerDied","Data":"0f6e91f02b08a51a366d8f5326b3a7a8731af2887f01a4ebe7ba161db5ae6e3f"} Dec 05 02:28:21 crc kubenswrapper[4759]: I1205 02:28:21.766823 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-7qh92" Dec 05 02:28:21 crc kubenswrapper[4759]: I1205 02:28:21.910074 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1df391fa-af95-437f-9257-46b3f3650b22-host\") pod \"1df391fa-af95-437f-9257-46b3f3650b22\" (UID: \"1df391fa-af95-437f-9257-46b3f3650b22\") " Dec 05 02:28:21 crc kubenswrapper[4759]: I1205 02:28:21.910179 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1df391fa-af95-437f-9257-46b3f3650b22-host" (OuterVolumeSpecName: "host") pod "1df391fa-af95-437f-9257-46b3f3650b22" (UID: "1df391fa-af95-437f-9257-46b3f3650b22"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 02:28:21 crc kubenswrapper[4759]: I1205 02:28:21.913582 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsgs7\" (UniqueName: \"kubernetes.io/projected/1df391fa-af95-437f-9257-46b3f3650b22-kube-api-access-dsgs7\") pod \"1df391fa-af95-437f-9257-46b3f3650b22\" (UID: \"1df391fa-af95-437f-9257-46b3f3650b22\") " Dec 05 02:28:21 crc kubenswrapper[4759]: I1205 02:28:21.915522 4759 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1df391fa-af95-437f-9257-46b3f3650b22-host\") on node \"crc\" DevicePath \"\"" Dec 05 02:28:21 crc kubenswrapper[4759]: I1205 02:28:21.919643 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df391fa-af95-437f-9257-46b3f3650b22-kube-api-access-dsgs7" (OuterVolumeSpecName: "kube-api-access-dsgs7") pod "1df391fa-af95-437f-9257-46b3f3650b22" (UID: "1df391fa-af95-437f-9257-46b3f3650b22"). InnerVolumeSpecName "kube-api-access-dsgs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:28:22 crc kubenswrapper[4759]: I1205 02:28:22.017008 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsgs7\" (UniqueName: \"kubernetes.io/projected/1df391fa-af95-437f-9257-46b3f3650b22-kube-api-access-dsgs7\") on node \"crc\" DevicePath \"\"" Dec 05 02:28:22 crc kubenswrapper[4759]: I1205 02:28:22.662351 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/crc-debug-7qh92" event={"ID":"1df391fa-af95-437f-9257-46b3f3650b22","Type":"ContainerDied","Data":"ff2329ba6a7866ab10d5a97f7af07c3636a5a757af2970a8aa1d0c5a3be6b810"} Dec 05 02:28:22 crc kubenswrapper[4759]: I1205 02:28:22.662392 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff2329ba6a7866ab10d5a97f7af07c3636a5a757af2970a8aa1d0c5a3be6b810" Dec 05 02:28:22 crc kubenswrapper[4759]: I1205 02:28:22.662720 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-7qh92" Dec 05 02:28:22 crc kubenswrapper[4759]: I1205 02:28:22.979103 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dc8qp/crc-debug-7qh92"] Dec 05 02:28:22 crc kubenswrapper[4759]: I1205 02:28:22.990015 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dc8qp/crc-debug-7qh92"] Dec 05 02:28:23 crc kubenswrapper[4759]: I1205 02:28:23.176853 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df391fa-af95-437f-9257-46b3f3650b22" path="/var/lib/kubelet/pods/1df391fa-af95-437f-9257-46b3f3650b22/volumes" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.216835 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dc8qp/crc-debug-cp6zl"] Dec 05 02:28:24 crc kubenswrapper[4759]: E1205 02:28:24.217727 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df391fa-af95-437f-9257-46b3f3650b22" containerName="container-00" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.217746 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df391fa-af95-437f-9257-46b3f3650b22" containerName="container-00" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.218042 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df391fa-af95-437f-9257-46b3f3650b22" containerName="container-00" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.218983 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.371683 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb519788-fd42-487a-8d5d-1c3d34c3697b-host\") pod \"crc-debug-cp6zl\" (UID: \"eb519788-fd42-487a-8d5d-1c3d34c3697b\") " pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.371758 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sblpg\" (UniqueName: \"kubernetes.io/projected/eb519788-fd42-487a-8d5d-1c3d34c3697b-kube-api-access-sblpg\") pod \"crc-debug-cp6zl\" (UID: \"eb519788-fd42-487a-8d5d-1c3d34c3697b\") " pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.474074 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb519788-fd42-487a-8d5d-1c3d34c3697b-host\") pod \"crc-debug-cp6zl\" (UID: \"eb519788-fd42-487a-8d5d-1c3d34c3697b\") " pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.474181 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sblpg\" (UniqueName: \"kubernetes.io/projected/eb519788-fd42-487a-8d5d-1c3d34c3697b-kube-api-access-sblpg\") pod \"crc-debug-cp6zl\" (UID: \"eb519788-fd42-487a-8d5d-1c3d34c3697b\") " pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.474252 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb519788-fd42-487a-8d5d-1c3d34c3697b-host\") pod \"crc-debug-cp6zl\" (UID: \"eb519788-fd42-487a-8d5d-1c3d34c3697b\") " pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.498615 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sblpg\" (UniqueName: \"kubernetes.io/projected/eb519788-fd42-487a-8d5d-1c3d34c3697b-kube-api-access-sblpg\") pod \"crc-debug-cp6zl\" (UID: \"eb519788-fd42-487a-8d5d-1c3d34c3697b\") " pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.543249 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" Dec 05 02:28:24 crc kubenswrapper[4759]: I1205 02:28:24.686205 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" event={"ID":"eb519788-fd42-487a-8d5d-1c3d34c3697b","Type":"ContainerStarted","Data":"2904cdf9ca1ef85371a2697ba2e572b515b13f51983806cad2ee9e65e1b0565b"} Dec 05 02:28:25 crc kubenswrapper[4759]: I1205 02:28:25.710953 4759 generic.go:334] "Generic (PLEG): container finished" podID="eb519788-fd42-487a-8d5d-1c3d34c3697b" containerID="48afc4d1b4700843c869b26babea375c909ca540fbdfce540e0016f219806c81" exitCode=0 Dec 05 02:28:25 crc kubenswrapper[4759]: I1205 02:28:25.711058 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" event={"ID":"eb519788-fd42-487a-8d5d-1c3d34c3697b","Type":"ContainerDied","Data":"48afc4d1b4700843c869b26babea375c909ca540fbdfce540e0016f219806c81"} Dec 05 02:28:25 crc kubenswrapper[4759]: I1205 02:28:25.860013 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dc8qp/crc-debug-cp6zl"] Dec 05 02:28:25 crc kubenswrapper[4759]: I1205 02:28:25.874071 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dc8qp/crc-debug-cp6zl"] Dec 05 02:28:26 crc kubenswrapper[4759]: I1205 02:28:26.887890 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" Dec 05 02:28:27 crc kubenswrapper[4759]: I1205 02:28:27.047617 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb519788-fd42-487a-8d5d-1c3d34c3697b-host\") pod \"eb519788-fd42-487a-8d5d-1c3d34c3697b\" (UID: \"eb519788-fd42-487a-8d5d-1c3d34c3697b\") " Dec 05 02:28:27 crc kubenswrapper[4759]: I1205 02:28:27.047723 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb519788-fd42-487a-8d5d-1c3d34c3697b-host" (OuterVolumeSpecName: "host") pod "eb519788-fd42-487a-8d5d-1c3d34c3697b" (UID: "eb519788-fd42-487a-8d5d-1c3d34c3697b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 02:28:27 crc kubenswrapper[4759]: I1205 02:28:27.048625 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sblpg\" (UniqueName: \"kubernetes.io/projected/eb519788-fd42-487a-8d5d-1c3d34c3697b-kube-api-access-sblpg\") pod \"eb519788-fd42-487a-8d5d-1c3d34c3697b\" (UID: \"eb519788-fd42-487a-8d5d-1c3d34c3697b\") " Dec 05 02:28:27 crc kubenswrapper[4759]: I1205 02:28:27.049879 4759 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb519788-fd42-487a-8d5d-1c3d34c3697b-host\") on node \"crc\" DevicePath \"\"" Dec 05 02:28:27 crc kubenswrapper[4759]: I1205 02:28:27.055245 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb519788-fd42-487a-8d5d-1c3d34c3697b-kube-api-access-sblpg" (OuterVolumeSpecName: "kube-api-access-sblpg") pod "eb519788-fd42-487a-8d5d-1c3d34c3697b" (UID: "eb519788-fd42-487a-8d5d-1c3d34c3697b"). InnerVolumeSpecName "kube-api-access-sblpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:28:27 crc kubenswrapper[4759]: I1205 02:28:27.151621 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sblpg\" (UniqueName: \"kubernetes.io/projected/eb519788-fd42-487a-8d5d-1c3d34c3697b-kube-api-access-sblpg\") on node \"crc\" DevicePath \"\"" Dec 05 02:28:27 crc kubenswrapper[4759]: I1205 02:28:27.173459 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb519788-fd42-487a-8d5d-1c3d34c3697b" path="/var/lib/kubelet/pods/eb519788-fd42-487a-8d5d-1c3d34c3697b/volumes" Dec 05 02:28:27 crc kubenswrapper[4759]: I1205 02:28:27.760045 4759 scope.go:117] "RemoveContainer" containerID="48afc4d1b4700843c869b26babea375c909ca540fbdfce540e0016f219806c81" Dec 05 02:28:27 crc kubenswrapper[4759]: I1205 02:28:27.760188 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc8qp/crc-debug-cp6zl" Dec 05 02:28:34 crc kubenswrapper[4759]: I1205 02:28:34.432875 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:28:34 crc kubenswrapper[4759]: I1205 02:28:34.434703 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:29:04 crc kubenswrapper[4759]: I1205 02:29:04.433556 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:29:04 crc kubenswrapper[4759]: I1205 02:29:04.434508 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:29:04 crc kubenswrapper[4759]: I1205 02:29:04.434589 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 02:29:04 crc kubenswrapper[4759]: I1205 02:29:04.436060 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ab5ccb875327560274e50ad18d4101cb298a901ea7a34a97c422033ec27e436"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 02:29:04 crc kubenswrapper[4759]: I1205 02:29:04.436187 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://3ab5ccb875327560274e50ad18d4101cb298a901ea7a34a97c422033ec27e436" gracePeriod=600 Dec 05 02:29:05 crc kubenswrapper[4759]: I1205 02:29:05.231464 
4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="3ab5ccb875327560274e50ad18d4101cb298a901ea7a34a97c422033ec27e436" exitCode=0 Dec 05 02:29:05 crc kubenswrapper[4759]: I1205 02:29:05.231669 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"3ab5ccb875327560274e50ad18d4101cb298a901ea7a34a97c422033ec27e436"} Dec 05 02:29:05 crc kubenswrapper[4759]: I1205 02:29:05.231697 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362"} Dec 05 02:29:05 crc kubenswrapper[4759]: I1205 02:29:05.231727 4759 scope.go:117] "RemoveContainer" containerID="b4e2ba6c22d1fb6ebf73668dd7fe9badac35a4b69ea7453a3930f10bfb2d3854" Dec 05 02:29:11 crc kubenswrapper[4759]: I1205 02:29:11.381783 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_dab6929f-0e72-4f06-84cd-c3db7967578f/aodh-evaluator/0.log" Dec 05 02:29:11 crc kubenswrapper[4759]: I1205 02:29:11.413724 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_dab6929f-0e72-4f06-84cd-c3db7967578f/aodh-api/0.log" Dec 05 02:29:11 crc kubenswrapper[4759]: I1205 02:29:11.601643 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_dab6929f-0e72-4f06-84cd-c3db7967578f/aodh-notifier/0.log" Dec 05 02:29:11 crc kubenswrapper[4759]: I1205 02:29:11.613316 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_dab6929f-0e72-4f06-84cd-c3db7967578f/aodh-listener/0.log" Dec 05 02:29:11 crc kubenswrapper[4759]: I1205 02:29:11.725051 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-bb856b8d-7psj7_52bf4fd7-6aa6-4bdf-b8ac-60c071d42455/barbican-api/0.log" Dec 05 02:29:11 crc kubenswrapper[4759]: I1205 02:29:11.864066 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-bb856b8d-7psj7_52bf4fd7-6aa6-4bdf-b8ac-60c071d42455/barbican-api-log/0.log" Dec 05 02:29:11 crc kubenswrapper[4759]: I1205 02:29:11.934772 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d8dd9dc58-8n9k2_0e7388a6-d295-4807-8ce7-1eeb7dc55707/barbican-keystone-listener/0.log" Dec 05 02:29:12 crc kubenswrapper[4759]: I1205 02:29:12.112499 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d8dd9dc58-8n9k2_0e7388a6-d295-4807-8ce7-1eeb7dc55707/barbican-keystone-listener-log/0.log" Dec 05 02:29:12 crc kubenswrapper[4759]: I1205 02:29:12.167347 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-77bd8fcb75-d6pnc_9502dfee-cb5d-44de-a549-4f0060d29d9b/barbican-worker/0.log" Dec 05 02:29:12 crc kubenswrapper[4759]: I1205 02:29:12.228582 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-77bd8fcb75-d6pnc_9502dfee-cb5d-44de-a549-4f0060d29d9b/barbican-worker-log/0.log" Dec 05 02:29:12 crc kubenswrapper[4759]: I1205 02:29:12.366131 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp_9542247a-5527-4bd2-bc5d-8bd30be01c1d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 
02:29:12 crc kubenswrapper[4759]: I1205 02:29:12.468326 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_507bfd65-c768-4dfb-9e1c-aed7bdf0ef55/ceilometer-central-agent/0.log" Dec 05 02:29:12 crc kubenswrapper[4759]: I1205 02:29:12.626658 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_507bfd65-c768-4dfb-9e1c-aed7bdf0ef55/proxy-httpd/0.log" Dec 05 02:29:12 crc kubenswrapper[4759]: I1205 02:29:12.639513 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_507bfd65-c768-4dfb-9e1c-aed7bdf0ef55/ceilometer-notification-agent/0.log" Dec 05 02:29:12 crc kubenswrapper[4759]: I1205 02:29:12.665815 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_507bfd65-c768-4dfb-9e1c-aed7bdf0ef55/sg-core/0.log" Dec 05 02:29:12 crc kubenswrapper[4759]: I1205 02:29:12.905473 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2_8d7edc63-8bf1-4356-bc8a-c719049e0cee/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:12 crc kubenswrapper[4759]: I1205 02:29:12.906017 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd_76e169bb-796f-43c4-a487-36cf0c3d13a0/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:13 crc kubenswrapper[4759]: I1205 02:29:13.379392 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8ba6782e-a35c-4c30-ae5f-5efb85cc001c/cinder-api/0.log" Dec 05 02:29:13 crc kubenswrapper[4759]: I1205 02:29:13.383152 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8ba6782e-a35c-4c30-ae5f-5efb85cc001c/cinder-api-log/0.log" Dec 05 02:29:13 crc kubenswrapper[4759]: I1205 02:29:13.581414 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_3772ce5b-f22d-4f9a-ad46-66923fae82be/cinder-backup/0.log" Dec 05 02:29:13 crc kubenswrapper[4759]: I1205 02:29:13.641233 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_3772ce5b-f22d-4f9a-ad46-66923fae82be/probe/0.log" Dec 05 02:29:13 crc kubenswrapper[4759]: I1205 02:29:13.708767 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_84365c40-0d24-43ab-b5d1-66c9531bb860/cinder-scheduler/0.log" Dec 05 02:29:13 crc kubenswrapper[4759]: I1205 02:29:13.845329 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_84365c40-0d24-43ab-b5d1-66c9531bb860/probe/0.log" Dec 05 02:29:14 crc kubenswrapper[4759]: I1205 02:29:14.005547 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_dbf1c346-6958-4849-8773-9d7b42b2c6fd/cinder-volume/0.log" Dec 05 02:29:14 crc kubenswrapper[4759]: I1205 02:29:14.098936 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_dbf1c346-6958-4849-8773-9d7b42b2c6fd/probe/0.log" Dec 05 02:29:14 crc kubenswrapper[4759]: I1205 02:29:14.234967 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr_fef9f327-ed85-42f2-a400-624e7c84374b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:14 crc kubenswrapper[4759]: I1205 02:29:14.316769 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7_3da88e6e-b264-4624-a173-5dd09edf5066/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:14 crc kubenswrapper[4759]: I1205 02:29:14.456649 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c6cf8d999-rmpvx_afdea169-3f66-4ad6-be4e-755db23f6a50/init/0.log" Dec 05 02:29:14 crc kubenswrapper[4759]: I1205 02:29:14.683482 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c6cf8d999-rmpvx_afdea169-3f66-4ad6-be4e-755db23f6a50/init/0.log" Dec 05 02:29:14 crc kubenswrapper[4759]: I1205 02:29:14.743181 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c6cf8d999-rmpvx_afdea169-3f66-4ad6-be4e-755db23f6a50/dnsmasq-dns/0.log" Dec 05 02:29:14 crc kubenswrapper[4759]: I1205 02:29:14.799171 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e29f7592-d4e5-4a46-bcdd-b52666d8e689/glance-httpd/0.log" Dec 05 02:29:14 crc kubenswrapper[4759]: I1205 02:29:14.875299 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e29f7592-d4e5-4a46-bcdd-b52666d8e689/glance-log/0.log" Dec 05 02:29:15 crc kubenswrapper[4759]: I1205 02:29:15.009887 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c44ace8b-2e47-4682-bcea-3626f840d31b/glance-httpd/0.log" Dec 05 02:29:15 crc kubenswrapper[4759]: I1205 02:29:15.018323 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c44ace8b-2e47-4682-bcea-3626f840d31b/glance-log/0.log" Dec 05 02:29:15 crc kubenswrapper[4759]: I1205 02:29:15.661194 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-85bff75774-kcbvj_75ee8344-adba-4c6d-83a2-52e1e8ce15e7/heat-engine/0.log" Dec 05 02:29:15 crc kubenswrapper[4759]: I1205 02:29:15.876823 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dc499994d-vp27z_81556b05-cd4e-407a-830f-e7e38962d519/horizon/0.log" Dec 05 02:29:16 crc kubenswrapper[4759]: I1205 02:29:16.169032 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k_361afa29-23e5-45e4-8c9d-c7da34c4b1ac/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:16 crc kubenswrapper[4759]: I1205 02:29:16.412917 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-n44mt_a0d1528b-9aba-49f6-982a-c0dc44cec8a8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:16 crc kubenswrapper[4759]: I1205 02:29:16.532385 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7666bd695c-zdxtn_93df44aa-16c3-4374-a75a-440bc03ba2cd/heat-api/0.log" Dec 05 02:29:16 crc kubenswrapper[4759]: I1205 02:29:16.639162 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dc499994d-vp27z_81556b05-cd4e-407a-830f-e7e38962d519/horizon-log/0.log" Dec 05 02:29:16 crc kubenswrapper[4759]: I1205 02:29:16.663222 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5fd8599d54-rjclc_c6dff63f-0a5b-4f52-a373-39a85e77df1e/heat-cfnapi/0.log" Dec 05 02:29:16 crc kubenswrapper[4759]: I1205 02:29:16.884240 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29414941-zc2s9_411ce212-9655-4a4e-8056-adcbaf433178/keystone-cron/0.log" Dec 05 02:29:17 crc kubenswrapper[4759]: I1205 02:29:17.032772 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29415001-qdv8x_f40db81b-0573-4c38-9382-5c23ef6cde76/keystone-cron/0.log" Dec 05 02:29:17 crc kubenswrapper[4759]: I1205 02:29:17.079632 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-b7c787945-xkhnb_b37f0510-4911-4842-866a-863c4ac7e7c9/keystone-api/0.log" Dec 05 02:29:17 crc kubenswrapper[4759]: I1205 02:29:17.172594 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_43572d36-66d8-45df-9976-33f0b1e313f9/kube-state-metrics/0.log" Dec 05 02:29:17 crc kubenswrapper[4759]: I1205 02:29:17.386582 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q_b68949f6-0ba5-476a-8ff1-9b4247fe99e8/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:17 crc kubenswrapper[4759]: I1205 02:29:17.442286 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-k248z_3e40e39a-7038-4839-9104-6cf64842c4a7/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:17 crc kubenswrapper[4759]: I1205 02:29:17.629371 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_1acbfd13-8d88-4169-b1f6-098a33b9cc15/manila-api-log/0.log" Dec 05 02:29:17 crc kubenswrapper[4759]: I1205 02:29:17.691555 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_1acbfd13-8d88-4169-b1f6-098a33b9cc15/manila-api/0.log" Dec 05 02:29:17 crc kubenswrapper[4759]: I1205 02:29:17.859646 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_20807b16-d503-447f-84ca-43f49c001c0c/manila-scheduler/0.log" Dec 05 02:29:17 crc kubenswrapper[4759]: I1205 02:29:17.866991 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_20807b16-d503-447f-84ca-43f49c001c0c/probe/0.log" Dec 05 02:29:18 crc kubenswrapper[4759]: I1205 02:29:18.389655 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a8b83f4d-0c22-48a4-b589-109ff6a5e8e2/manila-share/0.log" Dec 05 02:29:18 crc kubenswrapper[4759]: I1205 02:29:18.401194 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a8b83f4d-0c22-48a4-b589-109ff6a5e8e2/probe/0.log" Dec 05 02:29:18 crc kubenswrapper[4759]: I1205 02:29:18.570702 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2/mysqld-exporter/0.log" Dec 05 02:29:19 crc kubenswrapper[4759]: I1205 02:29:19.005423 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2_dad31a51-d010-4c0a-b52f-022acdb7d893/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:19 crc kubenswrapper[4759]: I1205 02:29:19.088806 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7677b6f8d5-zwkn7_b885b03c-f613-4c09-9ec3-8492c335923a/neutron-httpd/0.log" Dec 05 02:29:19 crc kubenswrapper[4759]: I1205 02:29:19.122375 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7677b6f8d5-zwkn7_b885b03c-f613-4c09-9ec3-8492c335923a/neutron-api/0.log" Dec 05 02:29:19 
crc kubenswrapper[4759]: I1205 02:29:19.658211 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5ed8d71f-0481-4f8c-aed6-972efd952e3b/nova-cell0-conductor-conductor/0.log" Dec 05 02:29:19 crc kubenswrapper[4759]: I1205 02:29:19.952167 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8aa5c436-87aa-44d6-b16e-076b4cca0bd5/nova-cell1-conductor-conductor/0.log" Dec 05 02:29:20 crc kubenswrapper[4759]: I1205 02:29:20.000593 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a886f187-7e44-44b0-8dd6-030df520def9/nova-api-log/0.log" Dec 05 02:29:20 crc kubenswrapper[4759]: I1205 02:29:20.310336 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7229698f-fca8-46ac-b297-59fd47d15e13/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 02:29:20 crc kubenswrapper[4759]: I1205 02:29:20.323204 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j_3fac4138-c163-4a29-b1b0-b78285e908ec/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:20 crc kubenswrapper[4759]: I1205 02:29:20.639030 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a886f187-7e44-44b0-8dd6-030df520def9/nova-api-api/0.log" Dec 05 02:29:20 crc kubenswrapper[4759]: I1205 02:29:20.657904 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_372a0f97-53ca-477d-9202-5616650e4192/nova-metadata-log/0.log" Dec 05 02:29:20 crc kubenswrapper[4759]: I1205 02:29:20.959448 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cb3095df-1b95-485e-99b5-6a3886c58ac3/mysql-bootstrap/0.log" Dec 05 02:29:21 crc kubenswrapper[4759]: I1205 02:29:21.087512 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9/nova-scheduler-scheduler/0.log" Dec 05 02:29:21 crc kubenswrapper[4759]: I1205 02:29:21.166045 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cb3095df-1b95-485e-99b5-6a3886c58ac3/mysql-bootstrap/0.log" Dec 05 02:29:21 crc kubenswrapper[4759]: I1205 02:29:21.191548 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cb3095df-1b95-485e-99b5-6a3886c58ac3/galera/0.log" Dec 05 02:29:21 crc kubenswrapper[4759]: I1205 02:29:21.378479 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1013063a-9cdb-47ba-8c7d-5161bbbad9d4/mysql-bootstrap/0.log" Dec 05 02:29:21 crc kubenswrapper[4759]: I1205 02:29:21.588858 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1013063a-9cdb-47ba-8c7d-5161bbbad9d4/mysql-bootstrap/0.log" Dec 05 02:29:21 crc kubenswrapper[4759]: I1205 02:29:21.674327 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1013063a-9cdb-47ba-8c7d-5161bbbad9d4/galera/0.log" Dec 05 02:29:22 crc kubenswrapper[4759]: I1205 02:29:22.003098 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8effa6a4-dc68-4020-bd47-c83bcdc8d337/openstackclient/0.log" Dec 05 02:29:22 crc kubenswrapper[4759]: I1205 02:29:22.091741 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-6qct9_0a66884c-5b7a-4462-8e9d-668a97883211/openstack-network-exporter/0.log" Dec 05 02:29:22 crc kubenswrapper[4759]: I1205 02:29:22.343469 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nw9xk_d4b47f07-88f8-4a9a-97ee-7c61be8a6235/ovn-controller/0.log" Dec 05 02:29:22 crc kubenswrapper[4759]: I1205 02:29:22.492370 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nscp4_87eff283-d384-4578-9a23-0d7dab551aab/ovsdb-server-init/0.log" Dec 05 02:29:22 crc kubenswrapper[4759]: I1205 02:29:22.740498 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nscp4_87eff283-d384-4578-9a23-0d7dab551aab/ovsdb-server-init/0.log" Dec 05 02:29:22 crc kubenswrapper[4759]: I1205 02:29:22.774738 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nscp4_87eff283-d384-4578-9a23-0d7dab551aab/ovsdb-server/0.log" Dec 05 02:29:22 crc kubenswrapper[4759]: I1205 02:29:22.804447 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nscp4_87eff283-d384-4578-9a23-0d7dab551aab/ovs-vswitchd/0.log" Dec 05 02:29:23 crc kubenswrapper[4759]: I1205 02:29:23.011886 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-qc2t8_5ec426df-120e-4f92-a1e3-3def5d61f3d3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:23 crc kubenswrapper[4759]: I1205 02:29:23.216578 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e813fe2b-c789-4c1a-89be-65e269dd6d17/openstack-network-exporter/0.log" Dec 05 02:29:23 crc kubenswrapper[4759]: I1205 02:29:23.217891 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e813fe2b-c789-4c1a-89be-65e269dd6d17/ovn-northd/0.log" Dec 05 02:29:23 crc kubenswrapper[4759]: I1205 02:29:23.428438 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a28d6f96-86fb-420c-a292-8c65e0088079/openstack-network-exporter/0.log" Dec 05 02:29:23 crc kubenswrapper[4759]: I1205 02:29:23.495622 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a28d6f96-86fb-420c-a292-8c65e0088079/ovsdbserver-nb/0.log" Dec 05 02:29:23 crc kubenswrapper[4759]: I1205 02:29:23.678097 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_42baeb94-be38-4927-bb0d-9b37877cf412/openstack-network-exporter/0.log" Dec 05 02:29:23 crc kubenswrapper[4759]: I1205 02:29:23.691433 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_42baeb94-be38-4927-bb0d-9b37877cf412/ovsdbserver-sb/0.log" Dec 05 02:29:23 crc kubenswrapper[4759]: I1205 02:29:23.772469 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_372a0f97-53ca-477d-9202-5616650e4192/nova-metadata-metadata/0.log" Dec 05 02:29:24 crc kubenswrapper[4759]: I1205 02:29:24.174996 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_df6b2930-dd32-49ef-a13e-2329eba38827/init-config-reloader/0.log" Dec 05 02:29:24 crc kubenswrapper[4759]: I1205 02:29:24.274350 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bb6fdf748-gpknz_a2ebc8a7-dfee-4768-a3c2-976932027197/placement-api/0.log" Dec 05 02:29:24 crc kubenswrapper[4759]: I1205 02:29:24.314961 4759 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-5bb6fdf748-gpknz_a2ebc8a7-dfee-4768-a3c2-976932027197/placement-log/0.log" Dec 05 02:29:24 crc kubenswrapper[4759]: I1205 02:29:24.411942 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_df6b2930-dd32-49ef-a13e-2329eba38827/init-config-reloader/0.log" Dec 05 02:29:24 crc kubenswrapper[4759]: I1205 02:29:24.502103 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_df6b2930-dd32-49ef-a13e-2329eba38827/config-reloader/0.log" Dec 05 02:29:24 crc kubenswrapper[4759]: I1205 02:29:24.529558 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_df6b2930-dd32-49ef-a13e-2329eba38827/thanos-sidecar/0.log" Dec 05 02:29:24 crc kubenswrapper[4759]: I1205 02:29:24.578261 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_df6b2930-dd32-49ef-a13e-2329eba38827/prometheus/0.log" Dec 05 02:29:24 crc kubenswrapper[4759]: I1205 02:29:24.776988 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_975c9850-0dc7-4b43-a521-015930850b0b/setup-container/0.log" Dec 05 02:29:24 crc kubenswrapper[4759]: I1205 02:29:24.960405 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_975c9850-0dc7-4b43-a521-015930850b0b/setup-container/0.log" Dec 05 02:29:25 crc kubenswrapper[4759]: I1205 02:29:25.067280 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_975c9850-0dc7-4b43-a521-015930850b0b/rabbitmq/0.log" Dec 05 02:29:25 crc kubenswrapper[4759]: I1205 02:29:25.082689 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1cf2a4df-221d-4b0c-8a47-114deb1af60a/setup-container/0.log" Dec 05 02:29:25 crc kubenswrapper[4759]: I1205 02:29:25.457349 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1cf2a4df-221d-4b0c-8a47-114deb1af60a/setup-container/0.log" Dec 05 02:29:25 crc kubenswrapper[4759]: I1205 02:29:25.473824 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1cf2a4df-221d-4b0c-8a47-114deb1af60a/rabbitmq/0.log" Dec 05 02:29:25 crc kubenswrapper[4759]: I1205 02:29:25.592554 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw_81fe58fd-9cda-4705-9755-2b9bb62211f7/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:25 crc kubenswrapper[4759]: I1205 02:29:25.725633 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x_77d4cfb2-ced1-4306-a020-5ea1a3ed597c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:25 crc kubenswrapper[4759]: I1205 02:29:25.845597 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-4n8lq_9b0d860b-da25-46cf-abf7-17755154fc43/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:26 crc kubenswrapper[4759]: I1205 02:29:26.024569 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x9fpm_cf4930f4-da24-4012-b8d1-1bcb0d5b0bef/ssh-known-hosts-edpm-deployment/0.log" Dec 05 02:29:26 crc kubenswrapper[4759]: I1205 02:29:26.267805 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-8b7bf4bd7-qq45k_f625a19c-a9af-401d-a834-37a79e3dfeb4/proxy-server/0.log" Dec 05 02:29:26 crc kubenswrapper[4759]: I1205 02:29:26.489074 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-wt66w_12e24711-58db-434b-97ed-5db25d183784/swift-ring-rebalance/0.log" Dec 05 02:29:26 crc kubenswrapper[4759]: I1205 02:29:26.512232 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/account-auditor/0.log" Dec 05 02:29:26 crc kubenswrapper[4759]: I1205 02:29:26.543296 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-8b7bf4bd7-qq45k_f625a19c-a9af-401d-a834-37a79e3dfeb4/proxy-httpd/0.log" Dec 05 02:29:26 crc kubenswrapper[4759]: I1205 02:29:26.757369 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/container-auditor/0.log" Dec 05 02:29:26 crc kubenswrapper[4759]: I1205 02:29:26.774430 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/account-replicator/0.log" Dec 05 02:29:26 crc kubenswrapper[4759]: I1205 02:29:26.785208 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/account-server/0.log" Dec 05 02:29:26 crc kubenswrapper[4759]: I1205 02:29:26.787750 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/account-reaper/0.log" Dec 05 02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.003606 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/container-server/0.log" Dec 05 02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.009757 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/container-replicator/0.log" Dec 05 02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.068295 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/container-updater/0.log" Dec 05 02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.085431 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/object-auditor/0.log" Dec 05 02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.231290 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/object-expirer/0.log" Dec 05 02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.266449 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/object-replicator/0.log" Dec 05 02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.349537 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/object-server/0.log" Dec 05 02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.371057 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/object-updater/0.log" Dec 05 02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.528532 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/rsync/0.log" Dec 05 
02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.539151 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/swift-recon-cron/0.log" Dec 05 02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.692430 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6_74529ffd-281e-4f93-b8a1-fc858a1369c4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:27 crc kubenswrapper[4759]: I1205 02:29:27.896657 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92_ef0b2002-5521-4629-8083-fd25b382c0db/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:28 crc kubenswrapper[4759]: I1205 02:29:28.141786 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_98168e43-7dcd-4145-b010-078c0d190596/test-operator-logs-container/0.log" Dec 05 02:29:28 crc kubenswrapper[4759]: I1205 02:29:28.353257 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k_643a4a0e-1e9d-43a0-927c-ddb0778691f3/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:29:28 crc kubenswrapper[4759]: I1205 02:29:28.968205 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_703704f3-2e29-4eed-8943-3a34a004d8fc/tempest-tests-tempest-tests-runner/0.log" Dec 05 02:29:39 crc kubenswrapper[4759]: I1205 02:29:39.825707 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cf79c940-d58e-4319-94e8-6bacc34b1ae5/memcached/0.log" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.202386 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv"] Dec 05 02:30:00 crc kubenswrapper[4759]: E1205 02:30:00.203688 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb519788-fd42-487a-8d5d-1c3d34c3697b" containerName="container-00" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.203713 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb519788-fd42-487a-8d5d-1c3d34c3697b" containerName="container-00" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.204113 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb519788-fd42-487a-8d5d-1c3d34c3697b" containerName="container-00" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.205192 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.208725 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.208726 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.213922 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv"] Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.330343 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88f573b8-0477-463d-919d-b7567cda9c42-config-volume\") pod \"collect-profiles-29415030-wcrxv\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.330534 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88f573b8-0477-463d-919d-b7567cda9c42-secret-volume\") pod \"collect-profiles-29415030-wcrxv\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.330563 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfxjl\" (UniqueName: \"kubernetes.io/projected/88f573b8-0477-463d-919d-b7567cda9c42-kube-api-access-pfxjl\") pod \"collect-profiles-29415030-wcrxv\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.432683 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88f573b8-0477-463d-919d-b7567cda9c42-secret-volume\") pod \"collect-profiles-29415030-wcrxv\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.433003 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfxjl\" (UniqueName: \"kubernetes.io/projected/88f573b8-0477-463d-919d-b7567cda9c42-kube-api-access-pfxjl\") pod \"collect-profiles-29415030-wcrxv\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.433162 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88f573b8-0477-463d-919d-b7567cda9c42-config-volume\") pod \"collect-profiles-29415030-wcrxv\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.434425 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88f573b8-0477-463d-919d-b7567cda9c42-config-volume\") pod 
\"collect-profiles-29415030-wcrxv\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.440547 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88f573b8-0477-463d-919d-b7567cda9c42-secret-volume\") pod \"collect-profiles-29415030-wcrxv\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.464279 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfxjl\" (UniqueName: \"kubernetes.io/projected/88f573b8-0477-463d-919d-b7567cda9c42-kube-api-access-pfxjl\") pod \"collect-profiles-29415030-wcrxv\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:00 crc kubenswrapper[4759]: I1205 02:30:00.531008 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:01 crc kubenswrapper[4759]: I1205 02:30:01.118520 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv"] Dec 05 02:30:01 crc kubenswrapper[4759]: I1205 02:30:01.903880 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" event={"ID":"88f573b8-0477-463d-919d-b7567cda9c42","Type":"ContainerStarted","Data":"4684ebd2f0f167bdcd717e1146ec3a46d7685617dcf3f24409b9dff705ed779a"} Dec 05 02:30:01 crc kubenswrapper[4759]: I1205 02:30:01.904449 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" event={"ID":"88f573b8-0477-463d-919d-b7567cda9c42","Type":"ContainerStarted","Data":"fbcb771fed750113d466985b7751cc6080bf04eae0c7f248cc1e24c361faa60b"} Dec 05 02:30:01 crc kubenswrapper[4759]: I1205 02:30:01.952399 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" podStartSLOduration=1.952340518 podStartE2EDuration="1.952340518s" podCreationTimestamp="2025-12-05 02:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 02:30:01.933669856 +0000 UTC m=+7621.149330806" watchObservedRunningTime="2025-12-05 02:30:01.952340518 +0000 UTC m=+7621.168001468" Dec 05 02:30:02 crc kubenswrapper[4759]: I1205 02:30:02.429120 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/util/0.log" Dec 05 02:30:02 crc kubenswrapper[4759]: I1205 02:30:02.615387 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/util/0.log" Dec 05 02:30:02 crc kubenswrapper[4759]: I1205 02:30:02.621253 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/pull/0.log" Dec 05 02:30:02 crc kubenswrapper[4759]: I1205 02:30:02.706163 4759 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/pull/0.log" Dec 05 02:30:02 crc kubenswrapper[4759]: I1205 02:30:02.856283 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/pull/0.log" Dec 05 02:30:02 crc kubenswrapper[4759]: I1205 02:30:02.858892 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/util/0.log" Dec 05 02:30:02 crc kubenswrapper[4759]: I1205 02:30:02.866802 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/extract/0.log" Dec 05 02:30:02 crc kubenswrapper[4759]: I1205 02:30:02.914728 4759 generic.go:334] "Generic (PLEG): container finished" podID="88f573b8-0477-463d-919d-b7567cda9c42" containerID="4684ebd2f0f167bdcd717e1146ec3a46d7685617dcf3f24409b9dff705ed779a" exitCode=0 Dec 05 02:30:02 crc kubenswrapper[4759]: I1205 02:30:02.914771 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" event={"ID":"88f573b8-0477-463d-919d-b7567cda9c42","Type":"ContainerDied","Data":"4684ebd2f0f167bdcd717e1146ec3a46d7685617dcf3f24409b9dff705ed779a"} Dec 05 02:30:03 crc kubenswrapper[4759]: I1205 02:30:03.062260 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ztxcj_5e28b15c-c39e-463a-b9a2-6f6df5addaf8/kube-rbac-proxy/0.log" Dec 05 02:30:03 crc kubenswrapper[4759]: I1205 02:30:03.119079 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ztxcj_5e28b15c-c39e-463a-b9a2-6f6df5addaf8/manager/0.log" Dec 05 02:30:03 crc kubenswrapper[4759]: I1205 02:30:03.260379 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-zczgr_310627fe-09af-4a51-8312-e2b3841d6634/kube-rbac-proxy/0.log" Dec 05 02:30:03 crc kubenswrapper[4759]: I1205 02:30:03.326741 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-zczgr_310627fe-09af-4a51-8312-e2b3841d6634/manager/0.log" Dec 05 02:30:03 crc kubenswrapper[4759]: I1205 02:30:03.414504 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-6gh82_faf33139-8ab8-400c-8a2a-bf746d11f7e7/kube-rbac-proxy/0.log" Dec 05 02:30:03 crc kubenswrapper[4759]: I1205 02:30:03.554460 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-6gh82_faf33139-8ab8-400c-8a2a-bf746d11f7e7/manager/0.log" Dec 05 02:30:03 crc kubenswrapper[4759]: I1205 02:30:03.607413 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-s6jzr_7377061e-a243-49b5-9728-4aaa2462445e/kube-rbac-proxy/0.log" Dec 05 02:30:03 crc kubenswrapper[4759]: I1205 02:30:03.705920 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-s6jzr_7377061e-a243-49b5-9728-4aaa2462445e/manager/0.log" Dec 05 02:30:03 crc kubenswrapper[4759]: I1205 02:30:03.822174 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-lrxdn_f0212865-8c85-4b7c-855c-baa0fc705bf8/kube-rbac-proxy/0.log" Dec 05 02:30:03 crc kubenswrapper[4759]: I1205 02:30:03.920795 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-lrxdn_f0212865-8c85-4b7c-855c-baa0fc705bf8/manager/0.log" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.046680 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jrlg9_a712ae8c-434d-43f7-ab4d-b385eee4eabf/kube-rbac-proxy/0.log" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.071764 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jrlg9_a712ae8c-434d-43f7-ab4d-b385eee4eabf/manager/0.log" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.322935 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-9jfsw_1cce914e-3baa-4146-a52c-e054ee0c1eed/kube-rbac-proxy/0.log" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.450869 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-9jfsw_1cce914e-3baa-4146-a52c-e054ee0c1eed/manager/0.log" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.481904 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-7s2r8_f07c10f0-5bec-4421-8ff0-2c659e42377b/kube-rbac-proxy/0.log" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.568829 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.627022 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfxjl\" (UniqueName: \"kubernetes.io/projected/88f573b8-0477-463d-919d-b7567cda9c42-kube-api-access-pfxjl\") pod \"88f573b8-0477-463d-919d-b7567cda9c42\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.627394 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88f573b8-0477-463d-919d-b7567cda9c42-secret-volume\") pod \"88f573b8-0477-463d-919d-b7567cda9c42\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.627531 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88f573b8-0477-463d-919d-b7567cda9c42-config-volume\") pod \"88f573b8-0477-463d-919d-b7567cda9c42\" (UID: \"88f573b8-0477-463d-919d-b7567cda9c42\") " Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.628384 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88f573b8-0477-463d-919d-b7567cda9c42-config-volume" (OuterVolumeSpecName: "config-volume") pod "88f573b8-0477-463d-919d-b7567cda9c42" (UID: "88f573b8-0477-463d-919d-b7567cda9c42"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.630561 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-7s2r8_f07c10f0-5bec-4421-8ff0-2c659e42377b/manager/0.log" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.635954 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f573b8-0477-463d-919d-b7567cda9c42-kube-api-access-pfxjl" (OuterVolumeSpecName: "kube-api-access-pfxjl") pod "88f573b8-0477-463d-919d-b7567cda9c42" (UID: "88f573b8-0477-463d-919d-b7567cda9c42"). InnerVolumeSpecName "kube-api-access-pfxjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.673712 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88f573b8-0477-463d-919d-b7567cda9c42-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "88f573b8-0477-463d-919d-b7567cda9c42" (UID: "88f573b8-0477-463d-919d-b7567cda9c42"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.731785 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfxjl\" (UniqueName: \"kubernetes.io/projected/88f573b8-0477-463d-919d-b7567cda9c42-kube-api-access-pfxjl\") on node \"crc\" DevicePath \"\"" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.731823 4759 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88f573b8-0477-463d-919d-b7567cda9c42-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.731834 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88f573b8-0477-463d-919d-b7567cda9c42-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.753265 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-5h7t4_1626dead-b9fd-4fae-af93-e2332112626f/manager/0.log" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.753476 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-5h7t4_1626dead-b9fd-4fae-af93-e2332112626f/kube-rbac-proxy/0.log" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.893142 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zh2wq_785b512b-7fa8-4480-b042-3811f10e3659/kube-rbac-proxy/0.log" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.944453 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" event={"ID":"88f573b8-0477-463d-919d-b7567cda9c42","Type":"ContainerDied","Data":"fbcb771fed750113d466985b7751cc6080bf04eae0c7f248cc1e24c361faa60b"} Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.944493 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbcb771fed750113d466985b7751cc6080bf04eae0c7f248cc1e24c361faa60b" Dec 05 02:30:04 crc kubenswrapper[4759]: I1205 02:30:04.944522 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415030-wcrxv" Dec 05 02:30:05 crc kubenswrapper[4759]: I1205 02:30:05.002240 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zh2wq_785b512b-7fa8-4480-b042-3811f10e3659/manager/0.log" Dec 05 02:30:05 crc kubenswrapper[4759]: I1205 02:30:05.123373 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ckxbs_cbc0cab7-b730-4ada-994d-eb8ae2e014df/kube-rbac-proxy/0.log" Dec 05 02:30:05 crc kubenswrapper[4759]: I1205 02:30:05.180345 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ckxbs_cbc0cab7-b730-4ada-994d-eb8ae2e014df/manager/0.log" Dec 05 02:30:05 crc kubenswrapper[4759]: I1205 02:30:05.306050 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-p5g9w_2dcdddec-138e-46fd-ab1d-15e4c4a06a15/kube-rbac-proxy/0.log" Dec 05 02:30:05 crc kubenswrapper[4759]: I1205 02:30:05.426837 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-p5g9w_2dcdddec-138e-46fd-ab1d-15e4c4a06a15/manager/0.log" Dec 05 02:30:05 crc kubenswrapper[4759]: I1205 02:30:05.473635 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-dcs2p_976da4b0-9b83-4ffe-9cf2-a07c3e149e04/kube-rbac-proxy/0.log" Dec 05 02:30:05 crc kubenswrapper[4759]: I1205 02:30:05.562589 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-dcs2p_976da4b0-9b83-4ffe-9cf2-a07c3e149e04/manager/0.log" Dec 05 02:30:05 crc kubenswrapper[4759]: I1205 02:30:05.750057 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78"] Dec 05 02:30:05 crc kubenswrapper[4759]: I1205 02:30:05.766516 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414985-t7g78"] Dec 05 02:30:06 crc kubenswrapper[4759]: I1205 02:30:06.079679 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-dwt5n_9894d9a4-5121-4345-ab9c-4f770f4e4bb0/kube-rbac-proxy/0.log" Dec 05 02:30:06 crc kubenswrapper[4759]: I1205 02:30:06.082527 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-dwt5n_9894d9a4-5121-4345-ab9c-4f770f4e4bb0/manager/0.log" Dec 05 02:30:06 crc kubenswrapper[4759]: I1205 02:30:06.273259 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g_33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9/kube-rbac-proxy/0.log" Dec 05 02:30:06 crc kubenswrapper[4759]: I1205 02:30:06.276871 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g_33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9/manager/0.log" Dec 05 02:30:06 crc kubenswrapper[4759]: I1205 02:30:06.615562 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-lcpcr_627231bc-7c87-4c95-9a7e-ca5c295bfc69/registry-server/0.log" Dec 
05 02:30:06 crc kubenswrapper[4759]: I1205 02:30:06.752498 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-j587r_0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da/kube-rbac-proxy/0.log" Dec 05 02:30:06 crc kubenswrapper[4759]: I1205 02:30:06.787899 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7c496d6cb7-z59q2_c02805c1-2950-4e50-9163-a3ca8d5c4319/operator/0.log" Dec 05 02:30:06 crc kubenswrapper[4759]: I1205 02:30:06.963423 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-j587r_0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da/manager/0.log" Dec 05 02:30:07 crc kubenswrapper[4759]: I1205 02:30:07.009415 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-f9b75_f35362c5-4886-42dc-a633-c018e7f6aaf2/kube-rbac-proxy/0.log" Dec 05 02:30:07 crc kubenswrapper[4759]: I1205 02:30:07.071768 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-f9b75_f35362c5-4886-42dc-a633-c018e7f6aaf2/manager/0.log" Dec 05 02:30:07 crc kubenswrapper[4759]: I1205 02:30:07.182711 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e95e408-7164-432e-b96c-2ab6bc0d859a" path="/var/lib/kubelet/pods/0e95e408-7164-432e-b96c-2ab6bc0d859a/volumes" Dec 05 02:30:07 crc kubenswrapper[4759]: I1205 02:30:07.270978 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-85hsm_d832c1ee-6d66-4cd7-87eb-dc2d34f801cc/operator/0.log" Dec 05 02:30:07 crc kubenswrapper[4759]: I1205 02:30:07.411519 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-qxf6v_da012733-6903-4607-9be5-17c81d20ae6b/kube-rbac-proxy/0.log" Dec 05 02:30:07 crc kubenswrapper[4759]: I1205 02:30:07.520849 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-qxf6v_da012733-6903-4607-9be5-17c81d20ae6b/manager/0.log" Dec 05 02:30:07 crc kubenswrapper[4759]: I1205 02:30:07.574780 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6578c5f884-gml69_b6222e18-ffef-4dc6-b327-3b06bb91d75a/kube-rbac-proxy/0.log" Dec 05 02:30:07 crc kubenswrapper[4759]: I1205 02:30:07.782862 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-8m9f7_617fefa5-c3f6-450e-a569-8ee3dd12f882/kube-rbac-proxy/0.log" Dec 05 02:30:07 crc kubenswrapper[4759]: I1205 02:30:07.882162 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-8m9f7_617fefa5-c3f6-450e-a569-8ee3dd12f882/manager/0.log" Dec 05 02:30:08 crc kubenswrapper[4759]: I1205 02:30:08.045450 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-w42fg_e081130e-f14c-489c-9e4e-faab3dbdee6c/kube-rbac-proxy/0.log" Dec 05 02:30:08 crc kubenswrapper[4759]: I1205 02:30:08.069476 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6578c5f884-gml69_b6222e18-ffef-4dc6-b327-3b06bb91d75a/manager/0.log" Dec 05 02:30:08 crc kubenswrapper[4759]: I1205 02:30:08.148189 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-759bbb976c-dtqzv_4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba/manager/0.log" Dec 05 02:30:08 crc kubenswrapper[4759]: I1205 02:30:08.191387 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-w42fg_e081130e-f14c-489c-9e4e-faab3dbdee6c/manager/0.log" Dec 05 02:30:30 crc kubenswrapper[4759]: I1205 02:30:30.080041 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2zx9b_1a9ce926-4d8b-4608-9c75-9ddbc87a2464/control-plane-machine-set-operator/0.log" Dec 05 02:30:30 crc kubenswrapper[4759]: I1205 02:30:30.295460 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gng7x_f2b40743-a414-4dd8-9613-0bc14b937e3d/kube-rbac-proxy/0.log" Dec 05 02:30:30 crc kubenswrapper[4759]: I1205 02:30:30.313298 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gng7x_f2b40743-a414-4dd8-9613-0bc14b937e3d/machine-api-operator/0.log" Dec 05 02:30:38 crc kubenswrapper[4759]: I1205 02:30:38.817404 4759 scope.go:117] "RemoveContainer" containerID="747a7b77c7446bee1d5eb864b4b855b34f7dde5d790dbb2e750622dff03fc40d" Dec 05 02:30:46 crc kubenswrapper[4759]: I1205 02:30:46.083190 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4d8pk_a45999cd-2b52-4802-8bf1-98905eb68923/cert-manager-controller/0.log" Dec 05 02:30:46 crc kubenswrapper[4759]: I1205 02:30:46.280087 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-p4smd_a52b68b3-6d54-414e-8ab9-37788f4ec793/cert-manager-cainjector/0.log" Dec 05 02:30:46 crc kubenswrapper[4759]: I1205 02:30:46.281965 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-h8v7g_2e4a4c71-73fb-4b2d-aa99-aba26e4b4d65/cert-manager-webhook/0.log" Dec 05 02:31:00 crc kubenswrapper[4759]: I1205 02:31:00.871633 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-srcnc_93f7aeec-1ff3-4cec-80b9-683bfda8584b/nmstate-console-plugin/0.log" Dec 05 02:31:01 crc kubenswrapper[4759]: I1205 02:31:01.098348 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-nsbmm_1c4aa01f-df16-4f20-914f-1238c9c497ab/nmstate-handler/0.log" Dec 05 02:31:01 crc kubenswrapper[4759]: I1205 02:31:01.127100 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-hhc6d_463bc80d-5fb1-4bf0-b596-4f41571b3178/kube-rbac-proxy/0.log" Dec 05 02:31:01 crc kubenswrapper[4759]: I1205 02:31:01.219772 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-hhc6d_463bc80d-5fb1-4bf0-b596-4f41571b3178/nmstate-metrics/0.log" Dec 05 02:31:01 crc kubenswrapper[4759]: I1205 02:31:01.332511 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-w55p6_c0583a6d-7e56-455f-8557-f78732ffd0dc/nmstate-operator/0.log" Dec 05 02:31:01 crc 
kubenswrapper[4759]: I1205 02:31:01.429147 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-9ztsr_05de94e3-b61f-4df3-a8f4-a0b97d65b575/nmstate-webhook/0.log" Dec 05 02:31:04 crc kubenswrapper[4759]: I1205 02:31:04.433381 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:31:04 crc kubenswrapper[4759]: I1205 02:31:04.433921 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:31:07 crc kubenswrapper[4759]: I1205 02:31:07.749272 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lxtxg"] Dec 05 02:31:07 crc kubenswrapper[4759]: E1205 02:31:07.751297 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f573b8-0477-463d-919d-b7567cda9c42" containerName="collect-profiles" Dec 05 02:31:07 crc kubenswrapper[4759]: I1205 02:31:07.751410 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f573b8-0477-463d-919d-b7567cda9c42" containerName="collect-profiles" Dec 05 02:31:07 crc kubenswrapper[4759]: I1205 02:31:07.751856 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f573b8-0477-463d-919d-b7567cda9c42" containerName="collect-profiles" Dec 05 02:31:07 crc kubenswrapper[4759]: I1205 02:31:07.786346 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:07 crc kubenswrapper[4759]: I1205 02:31:07.795875 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lxtxg"] Dec 05 02:31:07 crc kubenswrapper[4759]: I1205 02:31:07.921250 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-catalog-content\") pod \"redhat-operators-lxtxg\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:07 crc kubenswrapper[4759]: I1205 02:31:07.921489 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-utilities\") pod \"redhat-operators-lxtxg\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:07 crc kubenswrapper[4759]: I1205 02:31:07.921553 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbsjr\" (UniqueName: \"kubernetes.io/projected/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-kube-api-access-dbsjr\") pod \"redhat-operators-lxtxg\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:08 crc kubenswrapper[4759]: I1205 02:31:08.023800 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-utilities\") pod \"redhat-operators-lxtxg\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:08 crc kubenswrapper[4759]: I1205 02:31:08.023847 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbsjr\" (UniqueName: \"kubernetes.io/projected/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-kube-api-access-dbsjr\") pod \"redhat-operators-lxtxg\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:08 crc kubenswrapper[4759]: I1205 02:31:08.023967 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-catalog-content\") pod \"redhat-operators-lxtxg\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:08 crc kubenswrapper[4759]: I1205 02:31:08.024778 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-catalog-content\") pod \"redhat-operators-lxtxg\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:08 crc kubenswrapper[4759]: I1205 02:31:08.025813 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-utilities\") pod \"redhat-operators-lxtxg\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:08 crc kubenswrapper[4759]: I1205 02:31:08.047116 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dbsjr\" (UniqueName: \"kubernetes.io/projected/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-kube-api-access-dbsjr\") pod \"redhat-operators-lxtxg\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:08 crc kubenswrapper[4759]: I1205 02:31:08.118181 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:08 crc kubenswrapper[4759]: I1205 02:31:08.638417 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lxtxg"] Dec 05 02:31:08 crc kubenswrapper[4759]: W1205 02:31:08.642524 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d5285f1_0f7d_48d7_a4ba_21d8ca8c6421.slice/crio-b14af02fbf6131844ae8023f687cc4b7b5faf329b5c99c8b8429fa29c142c6fc WatchSource:0}: Error finding container b14af02fbf6131844ae8023f687cc4b7b5faf329b5c99c8b8429fa29c142c6fc: Status 404 returned error can't find the container with id b14af02fbf6131844ae8023f687cc4b7b5faf329b5c99c8b8429fa29c142c6fc Dec 05 02:31:08 crc kubenswrapper[4759]: I1205 02:31:08.698200 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxtxg" event={"ID":"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421","Type":"ContainerStarted","Data":"b14af02fbf6131844ae8023f687cc4b7b5faf329b5c99c8b8429fa29c142c6fc"} Dec 05 02:31:09 crc kubenswrapper[4759]: I1205 02:31:09.710481 4759 generic.go:334] "Generic (PLEG): container finished" podID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerID="17bb059f9ca02c3b58c1c5d287275a2a4194e85698073abb0e27d5916d649af1" exitCode=0 Dec 05 02:31:09 crc kubenswrapper[4759]: I1205 02:31:09.710567 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxtxg" event={"ID":"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421","Type":"ContainerDied","Data":"17bb059f9ca02c3b58c1c5d287275a2a4194e85698073abb0e27d5916d649af1"} Dec 05 02:31:09 crc kubenswrapper[4759]: I1205 02:31:09.713179 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 02:31:11 crc kubenswrapper[4759]: I1205 02:31:11.746814 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxtxg" event={"ID":"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421","Type":"ContainerStarted","Data":"01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb"} Dec 05 02:31:14 crc kubenswrapper[4759]: I1205 02:31:14.777965 4759 generic.go:334] "Generic (PLEG): container finished" podID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerID="01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb" exitCode=0 Dec 05 02:31:14 crc kubenswrapper[4759]: I1205 02:31:14.778002 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxtxg" event={"ID":"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421","Type":"ContainerDied","Data":"01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb"} Dec 05 02:31:15 crc kubenswrapper[4759]: I1205 02:31:15.791747 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxtxg" event={"ID":"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421","Type":"ContainerStarted","Data":"5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066"} Dec 05 02:31:15 crc kubenswrapper[4759]: I1205 02:31:15.818683 4759 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-lxtxg" podStartSLOduration=3.205260719 podStartE2EDuration="8.818660218s" podCreationTimestamp="2025-12-05 02:31:07 +0000 UTC" firstStartedPulling="2025-12-05 02:31:09.712645201 +0000 UTC m=+7688.928306151" lastFinishedPulling="2025-12-05 02:31:15.32604471 +0000 UTC m=+7694.541705650" observedRunningTime="2025-12-05 02:31:15.810445881 +0000 UTC m=+7695.026106851" watchObservedRunningTime="2025-12-05 02:31:15.818660218 +0000 UTC m=+7695.034321168" Dec 05 02:31:16 crc kubenswrapper[4759]: I1205 02:31:16.516030 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55cfc66bc8-kk6tz_7c292474-9687-4ff3-a1c3-4dffe9594a36/kube-rbac-proxy/0.log" Dec 05 02:31:16 crc kubenswrapper[4759]: I1205 02:31:16.606691 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55cfc66bc8-kk6tz_7c292474-9687-4ff3-a1c3-4dffe9594a36/manager/0.log" Dec 05 02:31:18 crc kubenswrapper[4759]: I1205 02:31:18.119477 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:18 crc kubenswrapper[4759]: I1205 02:31:18.122214 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:19 crc kubenswrapper[4759]: I1205 02:31:19.170251 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lxtxg" podUID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerName="registry-server" probeResult="failure" output=< Dec 05 02:31:19 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 02:31:19 crc kubenswrapper[4759]: > Dec 05 02:31:29 crc kubenswrapper[4759]: I1205 02:31:29.185280 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lxtxg" podUID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerName="registry-server" probeResult="failure" output=< Dec 05 02:31:29 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 02:31:29 crc kubenswrapper[4759]: > Dec 05 02:31:33 crc kubenswrapper[4759]: I1205 02:31:33.531244 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-j6x8h_bd82eb36-66d9-4938-a5ab-29c36b1f482e/cluster-logging-operator/0.log" Dec 05 02:31:33 crc kubenswrapper[4759]: I1205 02:31:33.676693 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-dx9kr_35a9bf94-4e4a-4d68-95d8-1ae9421bb76f/collector/0.log" Dec 05 02:31:33 crc kubenswrapper[4759]: I1205 02:31:33.756694 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_d4684c5d-5bd7-4500-8f88-1778f47325c3/loki-compactor/0.log" Dec 05 02:31:33 crc kubenswrapper[4759]: I1205 02:31:33.865520 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-sfxqb_ec954d4c-6908-403f-8241-87a5191ddd17/loki-distributor/0.log" Dec 05 02:31:33 crc kubenswrapper[4759]: I1205 02:31:33.969532 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-99665cbbf-cf7qk_e2fb0fbf-7c9c-4671-af48-6217b781c53d/gateway/0.log" Dec 05 02:31:34 crc kubenswrapper[4759]: I1205 02:31:34.055009 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-gateway-99665cbbf-cf7qk_e2fb0fbf-7c9c-4671-af48-6217b781c53d/opa/0.log" Dec 05 02:31:34 crc kubenswrapper[4759]: I1205 02:31:34.137892 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-99665cbbf-gczvw_26e7c80b-666f-472c-8fb4-d3349c69227e/gateway/0.log" Dec 05 02:31:34 crc kubenswrapper[4759]: I1205 02:31:34.205693 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-99665cbbf-gczvw_26e7c80b-666f-472c-8fb4-d3349c69227e/opa/0.log" Dec 05 02:31:34 crc kubenswrapper[4759]: I1205 02:31:34.307074 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_c6186de1-0fbc-4432-8bb4-c95e25efe3a7/loki-index-gateway/0.log" Dec 05 02:31:34 crc kubenswrapper[4759]: I1205 02:31:34.432997 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:31:34 crc kubenswrapper[4759]: I1205 02:31:34.433077 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:31:34 crc kubenswrapper[4759]: I1205 02:31:34.482988 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_e0a9677f-60fe-4bcf-8262-250684b96537/loki-ingester/0.log" Dec 05 02:31:34 crc kubenswrapper[4759]: I1205 02:31:34.543079 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-plh5s_a02b1847-e805-40e3-bbfb-0585e864e6d0/loki-querier/0.log" Dec 05 02:31:34 crc kubenswrapper[4759]: I1205 02:31:34.658052 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-62g2t_f45b1eaf-54f2-400d-996e-95fbaff73750/loki-query-frontend/0.log" Dec 05 02:31:38 crc kubenswrapper[4759]: I1205 02:31:38.225736 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:38 crc kubenswrapper[4759]: I1205 02:31:38.292921 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:38 crc kubenswrapper[4759]: I1205 02:31:38.924583 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lxtxg"] Dec 05 02:31:40 crc kubenswrapper[4759]: I1205 02:31:40.075184 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lxtxg" podUID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerName="registry-server" containerID="cri-o://5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066" gracePeriod=2 Dec 05 02:31:40 crc kubenswrapper[4759]: I1205 02:31:40.751549 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:40 crc kubenswrapper[4759]: I1205 02:31:40.778371 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-utilities\") pod \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " Dec 05 02:31:40 crc kubenswrapper[4759]: I1205 02:31:40.778763 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-catalog-content\") pod \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " Dec 05 02:31:40 crc kubenswrapper[4759]: I1205 02:31:40.779172 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-utilities" (OuterVolumeSpecName: "utilities") pod "1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" (UID: "1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:31:40 crc kubenswrapper[4759]: I1205 02:31:40.779181 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsjr\" (UniqueName: \"kubernetes.io/projected/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-kube-api-access-dbsjr\") pod \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\" (UID: \"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421\") " Dec 05 02:31:40 crc kubenswrapper[4759]: I1205 02:31:40.780488 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:31:40 crc kubenswrapper[4759]: I1205 02:31:40.810893 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-kube-api-access-dbsjr" (OuterVolumeSpecName: "kube-api-access-dbsjr") pod "1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" (UID: "1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421"). InnerVolumeSpecName "kube-api-access-dbsjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:31:40 crc kubenswrapper[4759]: I1205 02:31:40.883072 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsjr\" (UniqueName: \"kubernetes.io/projected/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-kube-api-access-dbsjr\") on node \"crc\" DevicePath \"\"" Dec 05 02:31:40 crc kubenswrapper[4759]: I1205 02:31:40.890873 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" (UID: "1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:31:40 crc kubenswrapper[4759]: I1205 02:31:40.985561 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.090346 4759 generic.go:334] "Generic (PLEG): container finished" podID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerID="5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066" exitCode=0 Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.090395 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxtxg" event={"ID":"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421","Type":"ContainerDied","Data":"5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066"} Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.090429 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxtxg" event={"ID":"1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421","Type":"ContainerDied","Data":"b14af02fbf6131844ae8023f687cc4b7b5faf329b5c99c8b8429fa29c142c6fc"} Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.090444 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxtxg" Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.090453 4759 scope.go:117] "RemoveContainer" containerID="5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066" Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.137176 4759 scope.go:117] "RemoveContainer" containerID="01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb" Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.172279 4759 scope.go:117] "RemoveContainer" containerID="17bb059f9ca02c3b58c1c5d287275a2a4194e85698073abb0e27d5916d649af1" Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.174809 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lxtxg"] Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.184567 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lxtxg"] Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.232366 4759 scope.go:117] "RemoveContainer" containerID="5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066" Dec 05 02:31:41 crc kubenswrapper[4759]: E1205 02:31:41.238421 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066\": container with ID starting with 5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066 not found: ID does not exist" containerID="5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066" Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.238477 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066"} err="failed to get container status \"5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066\": rpc error: code = NotFound desc = could not find container \"5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066\": container with ID starting with 5f657906b09e17c4bbb1be8c14b5e75f4665c84330fa200fe5d3ad0a01400066 not found: ID does not exist" Dec 05 02:31:41 crc 
kubenswrapper[4759]: I1205 02:31:41.238512 4759 scope.go:117] "RemoveContainer" containerID="01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb" Dec 05 02:31:41 crc kubenswrapper[4759]: E1205 02:31:41.239084 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb\": container with ID starting with 01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb not found: ID does not exist" containerID="01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb" Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.239129 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb"} err="failed to get container status \"01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb\": rpc error: code = NotFound desc = could not find container \"01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb\": container with ID starting with 01bf9cced80c8b28ce5ac1c76cf6bf11a1a35f71c029ced7245976c996b169cb not found: ID does not exist" Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.239160 4759 scope.go:117] "RemoveContainer" containerID="17bb059f9ca02c3b58c1c5d287275a2a4194e85698073abb0e27d5916d649af1" Dec 05 02:31:41 crc kubenswrapper[4759]: E1205 02:31:41.239802 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17bb059f9ca02c3b58c1c5d287275a2a4194e85698073abb0e27d5916d649af1\": container with ID starting with 17bb059f9ca02c3b58c1c5d287275a2a4194e85698073abb0e27d5916d649af1 not found: ID does not exist" containerID="17bb059f9ca02c3b58c1c5d287275a2a4194e85698073abb0e27d5916d649af1" Dec 05 02:31:41 crc kubenswrapper[4759]: I1205 02:31:41.239836 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17bb059f9ca02c3b58c1c5d287275a2a4194e85698073abb0e27d5916d649af1"} err="failed to get container status \"17bb059f9ca02c3b58c1c5d287275a2a4194e85698073abb0e27d5916d649af1\": rpc error: code = NotFound desc = could not find container \"17bb059f9ca02c3b58c1c5d287275a2a4194e85698073abb0e27d5916d649af1\": container with ID starting with 17bb059f9ca02c3b58c1c5d287275a2a4194e85698073abb0e27d5916d649af1 not found: ID does not exist" Dec 05 02:31:43 crc kubenswrapper[4759]: I1205 02:31:43.178126 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" path="/var/lib/kubelet/pods/1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421/volumes" Dec 05 02:31:51 crc kubenswrapper[4759]: I1205 02:31:51.434439 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-dk9mn_78f52cea-319f-4493-aa77-b97f1fed1583/kube-rbac-proxy/0.log" Dec 05 02:31:51 crc kubenswrapper[4759]: I1205 02:31:51.512241 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-dk9mn_78f52cea-319f-4493-aa77-b97f1fed1583/controller/0.log" Dec 05 02:31:51 crc kubenswrapper[4759]: I1205 02:31:51.603376 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-frr-files/0.log" Dec 05 02:31:51 crc kubenswrapper[4759]: I1205 02:31:51.782733 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-metrics/0.log" Dec 05 02:31:51 crc kubenswrapper[4759]: I1205 02:31:51.801198 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-frr-files/0.log" Dec 05 02:31:51 crc kubenswrapper[4759]: I1205 02:31:51.808154 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-reloader/0.log" Dec 05 02:31:51 crc kubenswrapper[4759]: I1205 02:31:51.818415 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-reloader/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.017562 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-reloader/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.023503 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-frr-files/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.035130 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-metrics/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.067259 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-metrics/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.251875 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-frr-files/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.252519 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-metrics/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.301173 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/controller/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.304418 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-reloader/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.464298 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/frr-metrics/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.500608 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/kube-rbac-proxy-frr/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.532555 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/kube-rbac-proxy/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.754397 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/reloader/0.log" Dec 05 02:31:52 crc kubenswrapper[4759]: I1205 02:31:52.809676 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-7kpbw_b81e7b66-fed2-4b1e-8504-22a839862f14/frr-k8s-webhook-server/0.log" Dec 05 
02:31:53 crc kubenswrapper[4759]: I1205 02:31:53.014642 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5544dd96f7-h9gmp_f5b08a58-e4f1-4520-aec9-e0f99e93e731/manager/0.log" Dec 05 02:31:53 crc kubenswrapper[4759]: I1205 02:31:53.209280 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6959c5664d-69r4h_161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc/webhook-server/0.log" Dec 05 02:31:53 crc kubenswrapper[4759]: I1205 02:31:53.328556 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wv8kr_b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51/kube-rbac-proxy/0.log" Dec 05 02:31:54 crc kubenswrapper[4759]: I1205 02:31:54.006820 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wv8kr_b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51/speaker/0.log" Dec 05 02:31:54 crc kubenswrapper[4759]: I1205 02:31:54.536558 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/frr/0.log" Dec 05 02:32:04 crc kubenswrapper[4759]: I1205 02:32:04.433161 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:32:04 crc kubenswrapper[4759]: I1205 02:32:04.434980 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:32:04 crc kubenswrapper[4759]: I1205 02:32:04.435136 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 02:32:04 crc kubenswrapper[4759]: I1205 02:32:04.436546 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 02:32:04 crc kubenswrapper[4759]: I1205 02:32:04.436723 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" gracePeriod=600 Dec 05 02:32:04 crc kubenswrapper[4759]: E1205 02:32:04.577450 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:32:05 crc kubenswrapper[4759]: I1205 02:32:05.353301 4759 generic.go:334] "Generic (PLEG): container finished" 
podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" exitCode=0 Dec 05 02:32:05 crc kubenswrapper[4759]: I1205 02:32:05.353369 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362"} Dec 05 02:32:05 crc kubenswrapper[4759]: I1205 02:32:05.353773 4759 scope.go:117] "RemoveContainer" containerID="3ab5ccb875327560274e50ad18d4101cb298a901ea7a34a97c422033ec27e436" Dec 05 02:32:05 crc kubenswrapper[4759]: I1205 02:32:05.354745 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:32:05 crc kubenswrapper[4759]: E1205 02:32:05.355166 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:32:08 crc kubenswrapper[4759]: I1205 02:32:08.504376 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/util/0.log" Dec 05 02:32:08 crc kubenswrapper[4759]: I1205 02:32:08.660340 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/pull/0.log" Dec 05 02:32:08 crc kubenswrapper[4759]: I1205 02:32:08.662632 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/util/0.log" Dec 05 02:32:08 crc kubenswrapper[4759]: I1205 02:32:08.683146 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/pull/0.log" Dec 05 02:32:08 crc kubenswrapper[4759]: I1205 02:32:08.906841 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/util/0.log" Dec 05 02:32:08 crc kubenswrapper[4759]: I1205 02:32:08.941100 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/pull/0.log" Dec 05 02:32:08 crc kubenswrapper[4759]: I1205 02:32:08.972784 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/extract/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.081619 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/util/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.303570 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/pull/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.310534 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/pull/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.332244 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/util/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.440286 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/util/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.462940 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/pull/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.488594 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/extract/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.625525 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/util/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.795553 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/util/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.857069 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/pull/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.869776 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/pull/0.log" Dec 05 02:32:09 crc kubenswrapper[4759]: I1205 02:32:09.986570 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/util/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.035287 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/extract/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.035465 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/pull/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.177402 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/util/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.387797 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/pull/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.397363 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/util/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.401849 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/pull/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.532666 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/util/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.566364 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/extract/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.572833 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/pull/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.705450 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/util/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.880665 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/pull/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.894445 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/pull/0.log" Dec 05 02:32:10 crc kubenswrapper[4759]: I1205 02:32:10.939686 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/util/0.log" Dec 05 02:32:11 crc kubenswrapper[4759]: I1205 02:32:11.067793 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/extract/0.log" Dec 05 02:32:11 crc kubenswrapper[4759]: I1205 02:32:11.075657 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/util/0.log" Dec 05 02:32:11 crc kubenswrapper[4759]: I1205 02:32:11.099762 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/pull/0.log" Dec 05 02:32:11 crc kubenswrapper[4759]: I1205 02:32:11.254248 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-utilities/0.log" Dec 05 02:32:11 crc kubenswrapper[4759]: I1205 02:32:11.480096 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-utilities/0.log" Dec 05 02:32:11 crc kubenswrapper[4759]: I1205 02:32:11.491657 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-content/0.log" Dec 05 02:32:11 crc kubenswrapper[4759]: I1205 02:32:11.539390 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-content/0.log" Dec 05 02:32:11 crc kubenswrapper[4759]: I1205 02:32:11.657523 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-content/0.log" Dec 05 02:32:11 crc kubenswrapper[4759]: I1205 02:32:11.696865 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-utilities/0.log" Dec 05 02:32:11 crc kubenswrapper[4759]: I1205 02:32:11.828613 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-utilities/0.log" Dec 05 02:32:12 crc kubenswrapper[4759]: I1205 02:32:12.001037 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-utilities/0.log" Dec 05 02:32:12 crc kubenswrapper[4759]: I1205 02:32:12.010217 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-content/0.log" Dec 05 02:32:12 crc kubenswrapper[4759]: I1205 02:32:12.054871 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-content/0.log" Dec 05 02:32:12 crc kubenswrapper[4759]: I1205 02:32:12.355531 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-utilities/0.log" Dec 05 02:32:12 crc kubenswrapper[4759]: I1205 02:32:12.432814 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-content/0.log" Dec 05 02:32:12 crc kubenswrapper[4759]: I1205 02:32:12.561034 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-n7tt2_423b5d0f-1418-420b-80ca-f05d0087c85e/marketplace-operator/0.log" Dec 05 02:32:12 crc kubenswrapper[4759]: I1205 02:32:12.640694 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-utilities/0.log" Dec 05 02:32:12 crc kubenswrapper[4759]: I1205 02:32:12.688813 4759 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/registry-server/0.log" Dec 05 02:32:12 crc kubenswrapper[4759]: I1205 02:32:12.942519 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-utilities/0.log" Dec 05 02:32:12 crc kubenswrapper[4759]: I1205 02:32:12.942525 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-content/0.log" Dec 05 02:32:12 crc kubenswrapper[4759]: I1205 02:32:12.969025 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-content/0.log" Dec 05 02:32:13 crc kubenswrapper[4759]: I1205 02:32:13.161965 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-content/0.log" Dec 05 02:32:13 crc kubenswrapper[4759]: I1205 02:32:13.217706 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-utilities/0.log" Dec 05 02:32:13 crc kubenswrapper[4759]: I1205 02:32:13.376615 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-utilities/0.log" Dec 05 02:32:13 crc kubenswrapper[4759]: I1205 02:32:13.501558 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/registry-server/0.log" Dec 05 02:32:13 crc kubenswrapper[4759]: I1205 02:32:13.594378 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/registry-server/0.log" Dec 05 02:32:13 crc kubenswrapper[4759]: I1205 02:32:13.631365 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-utilities/0.log" Dec 05 02:32:13 crc kubenswrapper[4759]: I1205 02:32:13.676051 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-content/0.log" Dec 05 02:32:13 crc kubenswrapper[4759]: I1205 02:32:13.680628 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-content/0.log" Dec 05 02:32:13 crc kubenswrapper[4759]: I1205 02:32:13.812902 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-content/0.log" Dec 05 02:32:13 crc kubenswrapper[4759]: I1205 02:32:13.814713 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-utilities/0.log" Dec 05 02:32:14 crc kubenswrapper[4759]: I1205 02:32:14.779980 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/registry-server/0.log" Dec 05 02:32:20 crc kubenswrapper[4759]: I1205 02:32:20.156597 4759 scope.go:117] "RemoveContainer" 
containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:32:20 crc kubenswrapper[4759]: E1205 02:32:20.157684 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.225516 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-759qm"] Dec 05 02:32:21 crc kubenswrapper[4759]: E1205 02:32:21.226551 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerName="extract-content" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.226570 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerName="extract-content" Dec 05 02:32:21 crc kubenswrapper[4759]: E1205 02:32:21.226579 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerName="registry-server" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.226586 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerName="registry-server" Dec 05 02:32:21 crc kubenswrapper[4759]: E1205 02:32:21.226595 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerName="extract-utilities" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.226602 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerName="extract-utilities" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.226945 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d5285f1-0f7d-48d7-a4ba-21d8ca8c6421" containerName="registry-server" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.232469 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.248734 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-759qm"] Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.368998 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfzqb\" (UniqueName: \"kubernetes.io/projected/5421ae2c-2a34-48b2-b023-18f6d7ec056c-kube-api-access-gfzqb\") pod \"certified-operators-759qm\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.369197 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-catalog-content\") pod \"certified-operators-759qm\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.369533 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-utilities\") pod \"certified-operators-759qm\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.471802 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-catalog-content\") pod \"certified-operators-759qm\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.471930 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-utilities\") pod \"certified-operators-759qm\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.472014 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfzqb\" (UniqueName: \"kubernetes.io/projected/5421ae2c-2a34-48b2-b023-18f6d7ec056c-kube-api-access-gfzqb\") pod \"certified-operators-759qm\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.472798 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-catalog-content\") pod \"certified-operators-759qm\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.472809 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-utilities\") pod \"certified-operators-759qm\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.492860 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gfzqb\" (UniqueName: \"kubernetes.io/projected/5421ae2c-2a34-48b2-b023-18f6d7ec056c-kube-api-access-gfzqb\") pod \"certified-operators-759qm\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:21 crc kubenswrapper[4759]: I1205 02:32:21.564288 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:22 crc kubenswrapper[4759]: I1205 02:32:22.122022 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-759qm"] Dec 05 02:32:22 crc kubenswrapper[4759]: I1205 02:32:22.556665 4759 generic.go:334] "Generic (PLEG): container finished" podID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" containerID="19c35e3db57e1f0445dfcb5613be43bad8ab246ed495b00f2c52e6304c0e0d7e" exitCode=0 Dec 05 02:32:22 crc kubenswrapper[4759]: I1205 02:32:22.556718 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-759qm" event={"ID":"5421ae2c-2a34-48b2-b023-18f6d7ec056c","Type":"ContainerDied","Data":"19c35e3db57e1f0445dfcb5613be43bad8ab246ed495b00f2c52e6304c0e0d7e"} Dec 05 02:32:22 crc kubenswrapper[4759]: I1205 02:32:22.556751 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-759qm" event={"ID":"5421ae2c-2a34-48b2-b023-18f6d7ec056c","Type":"ContainerStarted","Data":"5121dc66f84c11b15d7094da844254456dd48df19cf5da325c67e386d02b6913"} Dec 05 02:32:24 crc kubenswrapper[4759]: I1205 02:32:24.581610 4759 generic.go:334] "Generic (PLEG): container finished" podID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" containerID="8622e7ca6f36c07d31d07deabea15ca9291c717cf30c92d85f001624d1be72ae" exitCode=0 Dec 05 02:32:24 crc kubenswrapper[4759]: I1205 02:32:24.581664 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-759qm" event={"ID":"5421ae2c-2a34-48b2-b023-18f6d7ec056c","Type":"ContainerDied","Data":"8622e7ca6f36c07d31d07deabea15ca9291c717cf30c92d85f001624d1be72ae"} Dec 05 02:32:25 crc kubenswrapper[4759]: I1205 02:32:25.597664 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-759qm" event={"ID":"5421ae2c-2a34-48b2-b023-18f6d7ec056c","Type":"ContainerStarted","Data":"42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3"} Dec 05 02:32:25 crc kubenswrapper[4759]: I1205 02:32:25.623728 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-759qm" podStartSLOduration=2.239132289 podStartE2EDuration="4.623700917s" podCreationTimestamp="2025-12-05 02:32:21 +0000 UTC" firstStartedPulling="2025-12-05 02:32:22.559481789 +0000 UTC m=+7761.775142739" lastFinishedPulling="2025-12-05 02:32:24.944050417 +0000 UTC m=+7764.159711367" observedRunningTime="2025-12-05 02:32:25.617666282 +0000 UTC m=+7764.833327272" watchObservedRunningTime="2025-12-05 02:32:25.623700917 +0000 UTC m=+7764.839361867" Dec 05 02:32:29 crc kubenswrapper[4759]: I1205 02:32:29.298274 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-sc8sd_b21d3e23-0940-4825-801e-ae74255085bd/prometheus-operator/0.log" Dec 05 02:32:29 crc kubenswrapper[4759]: I1205 02:32:29.661501 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_94fbcc74-1faa-44a4-8ea9-36028cc96003/prometheus-operator-admission-webhook/0.log" Dec 05 02:32:29 crc kubenswrapper[4759]: I1205 02:32:29.722742 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-6pmzz_d87236a6-f3c6-470f-a197-05846a9b0c22/operator/0.log" Dec 05 02:32:30 crc kubenswrapper[4759]: I1205 02:32:30.189011 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-q8g69_3555f68d-fb68-4cf9-91e0-51cc25d2305c/observability-ui-dashboards/0.log" Dec 05 02:32:30 crc kubenswrapper[4759]: I1205 02:32:30.199069 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_38b9b1d9-e67c-4aad-a22a-496d348f5148/prometheus-operator-admission-webhook/0.log" Dec 05 02:32:30 crc kubenswrapper[4759]: I1205 02:32:30.240087 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-qp8vs_96032405-2b01-4177-895c-f26ca2d838a9/perses-operator/0.log" Dec 05 02:32:31 crc kubenswrapper[4759]: I1205 02:32:31.564515 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:31 crc kubenswrapper[4759]: I1205 02:32:31.564639 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:31 crc kubenswrapper[4759]: I1205 02:32:31.622908 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:31 crc kubenswrapper[4759]: I1205 02:32:31.721990 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:31 crc kubenswrapper[4759]: I1205 02:32:31.862972 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-759qm"] Dec 05 02:32:32 crc kubenswrapper[4759]: I1205 02:32:32.156199 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:32:32 crc kubenswrapper[4759]: E1205 02:32:32.156570 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:32:33 crc kubenswrapper[4759]: I1205 02:32:33.690587 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-759qm" podUID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" containerName="registry-server" containerID="cri-o://42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3" gracePeriod=2 Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.287556 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.448073 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-utilities\") pod \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.448287 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-catalog-content\") pod \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.448342 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfzqb\" (UniqueName: \"kubernetes.io/projected/5421ae2c-2a34-48b2-b023-18f6d7ec056c-kube-api-access-gfzqb\") pod \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\" (UID: \"5421ae2c-2a34-48b2-b023-18f6d7ec056c\") " Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.449492 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-utilities" (OuterVolumeSpecName: "utilities") pod "5421ae2c-2a34-48b2-b023-18f6d7ec056c" (UID: "5421ae2c-2a34-48b2-b023-18f6d7ec056c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.457606 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5421ae2c-2a34-48b2-b023-18f6d7ec056c-kube-api-access-gfzqb" (OuterVolumeSpecName: "kube-api-access-gfzqb") pod "5421ae2c-2a34-48b2-b023-18f6d7ec056c" (UID: "5421ae2c-2a34-48b2-b023-18f6d7ec056c"). InnerVolumeSpecName "kube-api-access-gfzqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.500206 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5421ae2c-2a34-48b2-b023-18f6d7ec056c" (UID: "5421ae2c-2a34-48b2-b023-18f6d7ec056c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.551161 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.551196 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5421ae2c-2a34-48b2-b023-18f6d7ec056c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.551207 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfzqb\" (UniqueName: \"kubernetes.io/projected/5421ae2c-2a34-48b2-b023-18f6d7ec056c-kube-api-access-gfzqb\") on node \"crc\" DevicePath \"\"" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.707932 4759 generic.go:334] "Generic (PLEG): container finished" podID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" containerID="42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3" exitCode=0 Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.707996 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-759qm" event={"ID":"5421ae2c-2a34-48b2-b023-18f6d7ec056c","Type":"ContainerDied","Data":"42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3"} Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.708024 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-759qm" event={"ID":"5421ae2c-2a34-48b2-b023-18f6d7ec056c","Type":"ContainerDied","Data":"5121dc66f84c11b15d7094da844254456dd48df19cf5da325c67e386d02b6913"} Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.708039 4759 scope.go:117] "RemoveContainer" containerID="42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.708074 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-759qm" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.733106 4759 scope.go:117] "RemoveContainer" containerID="8622e7ca6f36c07d31d07deabea15ca9291c717cf30c92d85f001624d1be72ae" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.769856 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-759qm"] Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.770541 4759 scope.go:117] "RemoveContainer" containerID="19c35e3db57e1f0445dfcb5613be43bad8ab246ed495b00f2c52e6304c0e0d7e" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.783011 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-759qm"] Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.835606 4759 scope.go:117] "RemoveContainer" containerID="42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3" Dec 05 02:32:34 crc kubenswrapper[4759]: E1205 02:32:34.836191 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3\": container with ID starting with 42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3 not found: ID does not exist" containerID="42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.836238 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3"} err="failed to get container status \"42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3\": rpc error: code = NotFound desc = could not find container \"42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3\": container with ID starting with 42152555acf50f99aaa806eaae36c3674b7ed074e58b648330200794fdfd29d3 not found: ID does not exist" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.836264 4759 scope.go:117] "RemoveContainer" containerID="8622e7ca6f36c07d31d07deabea15ca9291c717cf30c92d85f001624d1be72ae" Dec 05 02:32:34 crc kubenswrapper[4759]: E1205 02:32:34.836676 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8622e7ca6f36c07d31d07deabea15ca9291c717cf30c92d85f001624d1be72ae\": container with ID starting with 8622e7ca6f36c07d31d07deabea15ca9291c717cf30c92d85f001624d1be72ae not found: ID does not exist" containerID="8622e7ca6f36c07d31d07deabea15ca9291c717cf30c92d85f001624d1be72ae" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.836700 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8622e7ca6f36c07d31d07deabea15ca9291c717cf30c92d85f001624d1be72ae"} err="failed to get container status \"8622e7ca6f36c07d31d07deabea15ca9291c717cf30c92d85f001624d1be72ae\": rpc error: code = NotFound desc = could not find container \"8622e7ca6f36c07d31d07deabea15ca9291c717cf30c92d85f001624d1be72ae\": container with ID starting with 8622e7ca6f36c07d31d07deabea15ca9291c717cf30c92d85f001624d1be72ae not found: ID does not exist" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.836718 4759 scope.go:117] "RemoveContainer" containerID="19c35e3db57e1f0445dfcb5613be43bad8ab246ed495b00f2c52e6304c0e0d7e" Dec 05 02:32:34 crc kubenswrapper[4759]: E1205 02:32:34.837087 4759 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"19c35e3db57e1f0445dfcb5613be43bad8ab246ed495b00f2c52e6304c0e0d7e\": container with ID starting with 19c35e3db57e1f0445dfcb5613be43bad8ab246ed495b00f2c52e6304c0e0d7e not found: ID does not exist" containerID="19c35e3db57e1f0445dfcb5613be43bad8ab246ed495b00f2c52e6304c0e0d7e" Dec 05 02:32:34 crc kubenswrapper[4759]: I1205 02:32:34.837109 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c35e3db57e1f0445dfcb5613be43bad8ab246ed495b00f2c52e6304c0e0d7e"} err="failed to get container status \"19c35e3db57e1f0445dfcb5613be43bad8ab246ed495b00f2c52e6304c0e0d7e\": rpc error: code = NotFound desc = could not find container \"19c35e3db57e1f0445dfcb5613be43bad8ab246ed495b00f2c52e6304c0e0d7e\": container with ID starting with 19c35e3db57e1f0445dfcb5613be43bad8ab246ed495b00f2c52e6304c0e0d7e not found: ID does not exist" Dec 05 02:32:35 crc kubenswrapper[4759]: I1205 02:32:35.171525 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" path="/var/lib/kubelet/pods/5421ae2c-2a34-48b2-b023-18f6d7ec056c/volumes" Dec 05 02:32:46 crc kubenswrapper[4759]: I1205 02:32:46.929605 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55cfc66bc8-kk6tz_7c292474-9687-4ff3-a1c3-4dffe9594a36/kube-rbac-proxy/0.log" Dec 05 02:32:47 crc kubenswrapper[4759]: I1205 02:32:47.019885 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55cfc66bc8-kk6tz_7c292474-9687-4ff3-a1c3-4dffe9594a36/manager/0.log" Dec 05 02:32:47 crc kubenswrapper[4759]: I1205 02:32:47.155801 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:32:47 crc kubenswrapper[4759]: E1205 02:32:47.156063 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:33:01 crc kubenswrapper[4759]: I1205 02:33:01.172775 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:33:01 crc kubenswrapper[4759]: E1205 02:33:01.173678 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:33:09 crc kubenswrapper[4759]: E1205 02:33:09.089245 4759 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.150:48072->38.102.83.150:42699: write tcp 38.102.83.150:48072->38.102.83.150:42699: write: broken pipe Dec 05 02:33:15 crc kubenswrapper[4759]: I1205 02:33:15.155489 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:33:15 crc kubenswrapper[4759]: E1205 
02:33:15.156249 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:33:27 crc kubenswrapper[4759]: I1205 02:33:27.158768 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:33:27 crc kubenswrapper[4759]: E1205 02:33:27.161369 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:33:39 crc kubenswrapper[4759]: I1205 02:33:39.011727 4759 scope.go:117] "RemoveContainer" containerID="374511b285a8ba1e5ecbe23b509c50843db296571f71ec5fa982be9dba719803" Dec 05 02:33:39 crc kubenswrapper[4759]: I1205 02:33:39.156624 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:33:39 crc kubenswrapper[4759]: E1205 02:33:39.157053 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:33:51 crc kubenswrapper[4759]: I1205 02:33:51.162457 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:33:51 crc kubenswrapper[4759]: E1205 02:33:51.163303 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:34:03 crc kubenswrapper[4759]: I1205 02:34:03.172794 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:34:03 crc kubenswrapper[4759]: E1205 02:34:03.174173 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:34:18 crc kubenswrapper[4759]: I1205 02:34:18.156110 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:34:18 crc kubenswrapper[4759]: E1205 02:34:18.158120 
4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:34:30 crc kubenswrapper[4759]: I1205 02:34:30.156553 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:34:30 crc kubenswrapper[4759]: E1205 02:34:30.157494 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:34:39 crc kubenswrapper[4759]: I1205 02:34:39.128401 4759 scope.go:117] "RemoveContainer" containerID="0f6e91f02b08a51a366d8f5326b3a7a8731af2887f01a4ebe7ba161db5ae6e3f" Dec 05 02:34:41 crc kubenswrapper[4759]: I1205 02:34:41.168252 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:34:41 crc kubenswrapper[4759]: E1205 02:34:41.169001 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:34:43 crc kubenswrapper[4759]: I1205 02:34:43.427493 4759 generic.go:334] "Generic (PLEG): container finished" podID="16c9ee17-7873-4479-814f-7233e5be32c9" containerID="237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196" exitCode=0 Dec 05 02:34:43 crc kubenswrapper[4759]: I1205 02:34:43.427587 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc8qp/must-gather-2kv9f" event={"ID":"16c9ee17-7873-4479-814f-7233e5be32c9","Type":"ContainerDied","Data":"237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196"} Dec 05 02:34:43 crc kubenswrapper[4759]: I1205 02:34:43.429635 4759 scope.go:117] "RemoveContainer" containerID="237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196" Dec 05 02:34:43 crc kubenswrapper[4759]: I1205 02:34:43.915882 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dc8qp_must-gather-2kv9f_16c9ee17-7873-4479-814f-7233e5be32c9/gather/0.log" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.307415 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nfqhg"] Dec 05 02:34:50 crc kubenswrapper[4759]: E1205 02:34:50.308418 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" containerName="extract-content" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.308441 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" containerName="extract-content" Dec 05 02:34:50 crc kubenswrapper[4759]: E1205 
02:34:50.308467 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" containerName="extract-utilities" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.308482 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" containerName="extract-utilities" Dec 05 02:34:50 crc kubenswrapper[4759]: E1205 02:34:50.308531 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" containerName="registry-server" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.308542 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" containerName="registry-server" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.308869 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5421ae2c-2a34-48b2-b023-18f6d7ec056c" containerName="registry-server" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.311365 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.319702 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nfqhg"] Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.483417 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-catalog-content\") pod \"community-operators-nfqhg\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.483795 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-utilities\") pod \"community-operators-nfqhg\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.483848 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5dgj\" (UniqueName: \"kubernetes.io/projected/5c56a846-9f08-43c6-93b5-4747de05747a-kube-api-access-l5dgj\") pod \"community-operators-nfqhg\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.586214 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-catalog-content\") pod \"community-operators-nfqhg\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.586357 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-utilities\") pod \"community-operators-nfqhg\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.586394 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5dgj\" (UniqueName: 
\"kubernetes.io/projected/5c56a846-9f08-43c6-93b5-4747de05747a-kube-api-access-l5dgj\") pod \"community-operators-nfqhg\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.586880 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-catalog-content\") pod \"community-operators-nfqhg\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.587134 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-utilities\") pod \"community-operators-nfqhg\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.613460 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5dgj\" (UniqueName: \"kubernetes.io/projected/5c56a846-9f08-43c6-93b5-4747de05747a-kube-api-access-l5dgj\") pod \"community-operators-nfqhg\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:34:50 crc kubenswrapper[4759]: I1205 02:34:50.686097 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:34:51 crc kubenswrapper[4759]: I1205 02:34:51.246760 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nfqhg"] Dec 05 02:34:51 crc kubenswrapper[4759]: I1205 02:34:51.516119 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfqhg" event={"ID":"5c56a846-9f08-43c6-93b5-4747de05747a","Type":"ContainerStarted","Data":"4fcae7afa2f2cc36d72b9d7f62700a47101ac6b2dd980853b379ec1003782da5"} Dec 05 02:34:51 crc kubenswrapper[4759]: I1205 02:34:51.516506 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfqhg" event={"ID":"5c56a846-9f08-43c6-93b5-4747de05747a","Type":"ContainerStarted","Data":"55784b94a5a9631740c9110e14b56f03c86ac20c457bd43bd2f3d95352469239"} Dec 05 02:34:51 crc kubenswrapper[4759]: I1205 02:34:51.894372 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dc8qp/must-gather-2kv9f"] Dec 05 02:34:51 crc kubenswrapper[4759]: I1205 02:34:51.894903 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-dc8qp/must-gather-2kv9f" podUID="16c9ee17-7873-4479-814f-7233e5be32c9" containerName="copy" containerID="cri-o://233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171" gracePeriod=2 Dec 05 02:34:51 crc kubenswrapper[4759]: I1205 02:34:51.908187 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dc8qp/must-gather-2kv9f"] Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.165588 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:34:52 crc kubenswrapper[4759]: E1205 02:34:52.167538 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.400334 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dc8qp_must-gather-2kv9f_16c9ee17-7873-4479-814f-7233e5be32c9/copy/0.log" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.400812 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc8qp/must-gather-2kv9f" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.546699 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5rlb\" (UniqueName: \"kubernetes.io/projected/16c9ee17-7873-4479-814f-7233e5be32c9-kube-api-access-s5rlb\") pod \"16c9ee17-7873-4479-814f-7233e5be32c9\" (UID: \"16c9ee17-7873-4479-814f-7233e5be32c9\") " Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.546941 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/16c9ee17-7873-4479-814f-7233e5be32c9-must-gather-output\") pod \"16c9ee17-7873-4479-814f-7233e5be32c9\" (UID: \"16c9ee17-7873-4479-814f-7233e5be32c9\") " Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.556242 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c9ee17-7873-4479-814f-7233e5be32c9-kube-api-access-s5rlb" (OuterVolumeSpecName: "kube-api-access-s5rlb") pod "16c9ee17-7873-4479-814f-7233e5be32c9" (UID: "16c9ee17-7873-4479-814f-7233e5be32c9"). InnerVolumeSpecName "kube-api-access-s5rlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.575429 4759 generic.go:334] "Generic (PLEG): container finished" podID="5c56a846-9f08-43c6-93b5-4747de05747a" containerID="4fcae7afa2f2cc36d72b9d7f62700a47101ac6b2dd980853b379ec1003782da5" exitCode=0 Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.575474 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfqhg" event={"ID":"5c56a846-9f08-43c6-93b5-4747de05747a","Type":"ContainerDied","Data":"4fcae7afa2f2cc36d72b9d7f62700a47101ac6b2dd980853b379ec1003782da5"} Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.579326 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dc8qp_must-gather-2kv9f_16c9ee17-7873-4479-814f-7233e5be32c9/copy/0.log" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.579875 4759 generic.go:334] "Generic (PLEG): container finished" podID="16c9ee17-7873-4479-814f-7233e5be32c9" containerID="233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171" exitCode=143 Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.579944 4759 scope.go:117] "RemoveContainer" containerID="233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.580111 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc8qp/must-gather-2kv9f" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.633678 4759 scope.go:117] "RemoveContainer" containerID="237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.651918 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5rlb\" (UniqueName: \"kubernetes.io/projected/16c9ee17-7873-4479-814f-7233e5be32c9-kube-api-access-s5rlb\") on node \"crc\" DevicePath \"\"" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.676755 4759 scope.go:117] "RemoveContainer" containerID="233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171" Dec 05 02:34:52 crc kubenswrapper[4759]: E1205 02:34:52.677281 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171\": container with ID starting with 233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171 not found: ID does not exist" containerID="233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.677333 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171"} err="failed to get container status \"233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171\": rpc error: code = NotFound desc = could not find container \"233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171\": container with ID starting with 233009007e7eb288a181813cf14afea5888ba6e58a44d3cef5f87f4d8440a171 not found: ID does not exist" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.677358 4759 scope.go:117] "RemoveContainer" containerID="237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196" Dec 05 02:34:52 crc kubenswrapper[4759]: E1205 02:34:52.677728 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196\": container with ID starting with 237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196 not found: ID does not exist" containerID="237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.677800 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196"} err="failed to get container status \"237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196\": rpc error: code = NotFound desc = could not find container \"237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196\": container with ID starting with 237cb5fffc13fe057d8ea7e0e9bf03bf1ae2f8674de2908b88d087d08bc8e196 not found: ID does not exist" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.795439 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c9ee17-7873-4479-814f-7233e5be32c9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "16c9ee17-7873-4479-814f-7233e5be32c9" (UID: "16c9ee17-7873-4479-814f-7233e5be32c9"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:34:52 crc kubenswrapper[4759]: I1205 02:34:52.856526 4759 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/16c9ee17-7873-4479-814f-7233e5be32c9-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 02:34:53 crc kubenswrapper[4759]: I1205 02:34:53.168589 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16c9ee17-7873-4479-814f-7233e5be32c9" path="/var/lib/kubelet/pods/16c9ee17-7873-4479-814f-7233e5be32c9/volumes" Dec 05 02:34:53 crc kubenswrapper[4759]: I1205 02:34:53.596166 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfqhg" event={"ID":"5c56a846-9f08-43c6-93b5-4747de05747a","Type":"ContainerStarted","Data":"391e06aef8ef24563e61e93890a43c677cffbc29005e08441d48394544733e52"} Dec 05 02:34:54 crc kubenswrapper[4759]: I1205 02:34:54.613207 4759 generic.go:334] "Generic (PLEG): container finished" podID="5c56a846-9f08-43c6-93b5-4747de05747a" containerID="391e06aef8ef24563e61e93890a43c677cffbc29005e08441d48394544733e52" exitCode=0 Dec 05 02:34:54 crc kubenswrapper[4759]: I1205 02:34:54.613297 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfqhg" event={"ID":"5c56a846-9f08-43c6-93b5-4747de05747a","Type":"ContainerDied","Data":"391e06aef8ef24563e61e93890a43c677cffbc29005e08441d48394544733e52"} Dec 05 02:34:55 crc kubenswrapper[4759]: I1205 02:34:55.628953 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfqhg" event={"ID":"5c56a846-9f08-43c6-93b5-4747de05747a","Type":"ContainerStarted","Data":"0bffbc9cfcf23b7768e1e6e6963c46cc092cca762cc86069f0de1f16109d05c3"} Dec 05 02:34:55 crc kubenswrapper[4759]: I1205 02:34:55.651688 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nfqhg" podStartSLOduration=3.200366097 podStartE2EDuration="5.651652197s" podCreationTimestamp="2025-12-05 02:34:50 +0000 UTC" firstStartedPulling="2025-12-05 02:34:52.577940062 +0000 UTC m=+7911.793601012" lastFinishedPulling="2025-12-05 02:34:55.029226152 +0000 UTC m=+7914.244887112" observedRunningTime="2025-12-05 02:34:55.650611142 +0000 UTC m=+7914.866272102" watchObservedRunningTime="2025-12-05 02:34:55.651652197 +0000 UTC m=+7914.867313147" Dec 05 02:35:00 crc kubenswrapper[4759]: I1205 02:35:00.687792 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:35:00 crc kubenswrapper[4759]: I1205 02:35:00.688434 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:35:00 crc kubenswrapper[4759]: I1205 02:35:00.762523 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:35:01 crc kubenswrapper[4759]: I1205 02:35:01.758189 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:35:01 crc kubenswrapper[4759]: I1205 02:35:01.821569 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nfqhg"] Dec 05 02:35:03 crc kubenswrapper[4759]: I1205 02:35:03.715093 4759 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-nfqhg" podUID="5c56a846-9f08-43c6-93b5-4747de05747a" containerName="registry-server" containerID="cri-o://0bffbc9cfcf23b7768e1e6e6963c46cc092cca762cc86069f0de1f16109d05c3" gracePeriod=2 Dec 05 02:35:04 crc kubenswrapper[4759]: I1205 02:35:04.155811 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:35:04 crc kubenswrapper[4759]: E1205 02:35:04.156055 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:35:04 crc kubenswrapper[4759]: I1205 02:35:04.733061 4759 generic.go:334] "Generic (PLEG): container finished" podID="5c56a846-9f08-43c6-93b5-4747de05747a" containerID="0bffbc9cfcf23b7768e1e6e6963c46cc092cca762cc86069f0de1f16109d05c3" exitCode=0 Dec 05 02:35:04 crc kubenswrapper[4759]: I1205 02:35:04.733165 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfqhg" event={"ID":"5c56a846-9f08-43c6-93b5-4747de05747a","Type":"ContainerDied","Data":"0bffbc9cfcf23b7768e1e6e6963c46cc092cca762cc86069f0de1f16109d05c3"} Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.095818 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.176336 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-catalog-content\") pod \"5c56a846-9f08-43c6-93b5-4747de05747a\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.176385 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-utilities\") pod \"5c56a846-9f08-43c6-93b5-4747de05747a\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.176618 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5dgj\" (UniqueName: \"kubernetes.io/projected/5c56a846-9f08-43c6-93b5-4747de05747a-kube-api-access-l5dgj\") pod \"5c56a846-9f08-43c6-93b5-4747de05747a\" (UID: \"5c56a846-9f08-43c6-93b5-4747de05747a\") " Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.177012 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-utilities" (OuterVolumeSpecName: "utilities") pod "5c56a846-9f08-43c6-93b5-4747de05747a" (UID: "5c56a846-9f08-43c6-93b5-4747de05747a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.177949 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.187695 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c56a846-9f08-43c6-93b5-4747de05747a-kube-api-access-l5dgj" (OuterVolumeSpecName: "kube-api-access-l5dgj") pod "5c56a846-9f08-43c6-93b5-4747de05747a" (UID: "5c56a846-9f08-43c6-93b5-4747de05747a"). InnerVolumeSpecName "kube-api-access-l5dgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.248289 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c56a846-9f08-43c6-93b5-4747de05747a" (UID: "5c56a846-9f08-43c6-93b5-4747de05747a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.280734 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c56a846-9f08-43c6-93b5-4747de05747a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.280768 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5dgj\" (UniqueName: \"kubernetes.io/projected/5c56a846-9f08-43c6-93b5-4747de05747a-kube-api-access-l5dgj\") on node \"crc\" DevicePath \"\"" Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.746546 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfqhg" event={"ID":"5c56a846-9f08-43c6-93b5-4747de05747a","Type":"ContainerDied","Data":"55784b94a5a9631740c9110e14b56f03c86ac20c457bd43bd2f3d95352469239"} Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.746580 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nfqhg" Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.746938 4759 scope.go:117] "RemoveContainer" containerID="0bffbc9cfcf23b7768e1e6e6963c46cc092cca762cc86069f0de1f16109d05c3" Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.791167 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nfqhg"] Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.794674 4759 scope.go:117] "RemoveContainer" containerID="391e06aef8ef24563e61e93890a43c677cffbc29005e08441d48394544733e52" Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.808041 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nfqhg"] Dec 05 02:35:05 crc kubenswrapper[4759]: I1205 02:35:05.822674 4759 scope.go:117] "RemoveContainer" containerID="4fcae7afa2f2cc36d72b9d7f62700a47101ac6b2dd980853b379ec1003782da5" Dec 05 02:35:07 crc kubenswrapper[4759]: I1205 02:35:07.171696 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c56a846-9f08-43c6-93b5-4747de05747a" path="/var/lib/kubelet/pods/5c56a846-9f08-43c6-93b5-4747de05747a/volumes" Dec 05 02:35:19 crc kubenswrapper[4759]: I1205 02:35:19.158888 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:35:19 crc kubenswrapper[4759]: E1205 02:35:19.160233 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:35:33 crc kubenswrapper[4759]: I1205 02:35:33.156258 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:35:33 crc kubenswrapper[4759]: E1205 02:35:33.157418 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:35:45 crc kubenswrapper[4759]: I1205 02:35:45.156552 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:35:45 crc kubenswrapper[4759]: E1205 02:35:45.157653 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:35:59 crc kubenswrapper[4759]: I1205 02:35:59.156017 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:35:59 crc kubenswrapper[4759]: E1205 02:35:59.157033 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:36:11 crc kubenswrapper[4759]: I1205 02:36:11.171876 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:36:11 crc kubenswrapper[4759]: E1205 02:36:11.172804 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:36:24 crc kubenswrapper[4759]: I1205 02:36:24.155479 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:36:24 crc kubenswrapper[4759]: E1205 02:36:24.156411 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:36:37 crc kubenswrapper[4759]: I1205 02:36:37.156603 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:36:37 crc kubenswrapper[4759]: E1205 02:36:37.157702 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:36:51 crc kubenswrapper[4759]: I1205 02:36:51.170463 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:36:51 crc kubenswrapper[4759]: E1205 02:36:51.171447 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:37:06 crc kubenswrapper[4759]: I1205 02:37:06.156642 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:37:06 crc kubenswrapper[4759]: I1205 02:37:06.620223 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" 
event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"cb796f1932774ddca456258736516c5764a73173d0beca98b07c9eb74be8a3e4"} Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.431926 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k9gnm/must-gather-ccsp9"] Dec 05 02:38:10 crc kubenswrapper[4759]: E1205 02:38:10.432936 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c9ee17-7873-4479-814f-7233e5be32c9" containerName="copy" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.432950 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c9ee17-7873-4479-814f-7233e5be32c9" containerName="copy" Dec 05 02:38:10 crc kubenswrapper[4759]: E1205 02:38:10.432963 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c56a846-9f08-43c6-93b5-4747de05747a" containerName="extract-utilities" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.432969 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c56a846-9f08-43c6-93b5-4747de05747a" containerName="extract-utilities" Dec 05 02:38:10 crc kubenswrapper[4759]: E1205 02:38:10.432983 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c56a846-9f08-43c6-93b5-4747de05747a" containerName="registry-server" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.432989 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c56a846-9f08-43c6-93b5-4747de05747a" containerName="registry-server" Dec 05 02:38:10 crc kubenswrapper[4759]: E1205 02:38:10.433022 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c9ee17-7873-4479-814f-7233e5be32c9" containerName="gather" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.433027 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c9ee17-7873-4479-814f-7233e5be32c9" containerName="gather" Dec 05 02:38:10 crc kubenswrapper[4759]: E1205 02:38:10.433041 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c56a846-9f08-43c6-93b5-4747de05747a" containerName="extract-content" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.433048 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c56a846-9f08-43c6-93b5-4747de05747a" containerName="extract-content" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.433260 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c56a846-9f08-43c6-93b5-4747de05747a" containerName="registry-server" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.433276 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c9ee17-7873-4479-814f-7233e5be32c9" containerName="copy" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.433292 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c9ee17-7873-4479-814f-7233e5be32c9" containerName="gather" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.434512 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k9gnm/must-gather-ccsp9" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.443848 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k9gnm/must-gather-ccsp9"] Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.447483 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k9gnm"/"kube-root-ca.crt" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.447498 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k9gnm"/"openshift-service-ca.crt" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.590384 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgxcw\" (UniqueName: \"kubernetes.io/projected/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-kube-api-access-cgxcw\") pod \"must-gather-ccsp9\" (UID: \"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a\") " pod="openshift-must-gather-k9gnm/must-gather-ccsp9" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.590739 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-must-gather-output\") pod \"must-gather-ccsp9\" (UID: \"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a\") " pod="openshift-must-gather-k9gnm/must-gather-ccsp9" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.692891 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-must-gather-output\") pod \"must-gather-ccsp9\" (UID: \"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a\") " pod="openshift-must-gather-k9gnm/must-gather-ccsp9" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.693046 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgxcw\" (UniqueName: \"kubernetes.io/projected/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-kube-api-access-cgxcw\") pod \"must-gather-ccsp9\" (UID: \"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a\") " pod="openshift-must-gather-k9gnm/must-gather-ccsp9" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.693907 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-must-gather-output\") pod \"must-gather-ccsp9\" (UID: \"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a\") " pod="openshift-must-gather-k9gnm/must-gather-ccsp9" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.712690 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgxcw\" (UniqueName: \"kubernetes.io/projected/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-kube-api-access-cgxcw\") pod \"must-gather-ccsp9\" (UID: \"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a\") " pod="openshift-must-gather-k9gnm/must-gather-ccsp9" Dec 05 02:38:10 crc kubenswrapper[4759]: I1205 02:38:10.767036 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k9gnm/must-gather-ccsp9" Dec 05 02:38:11 crc kubenswrapper[4759]: I1205 02:38:11.331759 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k9gnm/must-gather-ccsp9"] Dec 05 02:38:11 crc kubenswrapper[4759]: W1205 02:38:11.339765 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f11f324_34a2_4cd0_ac98_bf89a7b0b33a.slice/crio-80812623089e36ee80356e6da2e77a57e93f1387ceaf0a2493d9fcb2dda3421e WatchSource:0}: Error finding container 80812623089e36ee80356e6da2e77a57e93f1387ceaf0a2493d9fcb2dda3421e: Status 404 returned error can't find the container with id 80812623089e36ee80356e6da2e77a57e93f1387ceaf0a2493d9fcb2dda3421e Dec 05 02:38:11 crc kubenswrapper[4759]: I1205 02:38:11.452027 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/must-gather-ccsp9" event={"ID":"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a","Type":"ContainerStarted","Data":"80812623089e36ee80356e6da2e77a57e93f1387ceaf0a2493d9fcb2dda3421e"} Dec 05 02:38:12 crc kubenswrapper[4759]: I1205 02:38:12.467539 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/must-gather-ccsp9" event={"ID":"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a","Type":"ContainerStarted","Data":"11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948"} Dec 05 02:38:12 crc kubenswrapper[4759]: I1205 02:38:12.468135 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/must-gather-ccsp9" event={"ID":"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a","Type":"ContainerStarted","Data":"925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399"} Dec 05 02:38:12 crc kubenswrapper[4759]: I1205 02:38:12.487388 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k9gnm/must-gather-ccsp9" podStartSLOduration=2.487347738 podStartE2EDuration="2.487347738s" podCreationTimestamp="2025-12-05 02:38:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 02:38:12.484250493 +0000 UTC m=+8111.699911463" watchObservedRunningTime="2025-12-05 02:38:12.487347738 +0000 UTC m=+8111.703008698" Dec 05 02:38:14 crc kubenswrapper[4759]: E1205 02:38:14.881559 4759 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.150:43322->38.102.83.150:42699: write tcp 38.102.83.150:43322->38.102.83.150:42699: write: broken pipe Dec 05 02:38:16 crc kubenswrapper[4759]: I1205 02:38:16.269511 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k9gnm/crc-debug-jd2nm"] Dec 05 02:38:16 crc kubenswrapper[4759]: I1205 02:38:16.273947 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" Dec 05 02:38:16 crc kubenswrapper[4759]: I1205 02:38:16.279820 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-k9gnm"/"default-dockercfg-65qwf" Dec 05 02:38:16 crc kubenswrapper[4759]: I1205 02:38:16.422199 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcnj2\" (UniqueName: \"kubernetes.io/projected/b2cd4e4d-321d-4019-961d-2ce6865c47c4-kube-api-access-gcnj2\") pod \"crc-debug-jd2nm\" (UID: \"b2cd4e4d-321d-4019-961d-2ce6865c47c4\") " pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" Dec 05 02:38:16 crc kubenswrapper[4759]: I1205 02:38:16.422410 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2cd4e4d-321d-4019-961d-2ce6865c47c4-host\") pod \"crc-debug-jd2nm\" (UID: \"b2cd4e4d-321d-4019-961d-2ce6865c47c4\") " pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" Dec 05 02:38:16 crc kubenswrapper[4759]: I1205 02:38:16.524857 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2cd4e4d-321d-4019-961d-2ce6865c47c4-host\") pod \"crc-debug-jd2nm\" (UID: \"b2cd4e4d-321d-4019-961d-2ce6865c47c4\") " pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" Dec 05 02:38:16 crc kubenswrapper[4759]: I1205 02:38:16.524978 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcnj2\" (UniqueName: \"kubernetes.io/projected/b2cd4e4d-321d-4019-961d-2ce6865c47c4-kube-api-access-gcnj2\") pod \"crc-debug-jd2nm\" (UID: \"b2cd4e4d-321d-4019-961d-2ce6865c47c4\") " pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" Dec 05 02:38:16 crc kubenswrapper[4759]: I1205 02:38:16.525767 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2cd4e4d-321d-4019-961d-2ce6865c47c4-host\") pod \"crc-debug-jd2nm\" (UID: \"b2cd4e4d-321d-4019-961d-2ce6865c47c4\") " pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" Dec 05 02:38:16 crc kubenswrapper[4759]: I1205 02:38:16.545701 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcnj2\" (UniqueName: \"kubernetes.io/projected/b2cd4e4d-321d-4019-961d-2ce6865c47c4-kube-api-access-gcnj2\") pod \"crc-debug-jd2nm\" (UID: \"b2cd4e4d-321d-4019-961d-2ce6865c47c4\") " pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" Dec 05 02:38:16 crc kubenswrapper[4759]: I1205 02:38:16.591954 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" Dec 05 02:38:16 crc kubenswrapper[4759]: W1205 02:38:16.637967 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2cd4e4d_321d_4019_961d_2ce6865c47c4.slice/crio-457b39f3bffd54ff30c6ddad3419212b7ef2b99b738869794cf8720375a3c2a5 WatchSource:0}: Error finding container 457b39f3bffd54ff30c6ddad3419212b7ef2b99b738869794cf8720375a3c2a5: Status 404 returned error can't find the container with id 457b39f3bffd54ff30c6ddad3419212b7ef2b99b738869794cf8720375a3c2a5 Dec 05 02:38:17 crc kubenswrapper[4759]: I1205 02:38:17.511082 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" event={"ID":"b2cd4e4d-321d-4019-961d-2ce6865c47c4","Type":"ContainerStarted","Data":"bc862a70c98bd690b932c56f7267315f89f05a97a454eb6cce4cab16b7b19a59"} Dec 05 02:38:17 crc kubenswrapper[4759]: I1205 02:38:17.511571 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" event={"ID":"b2cd4e4d-321d-4019-961d-2ce6865c47c4","Type":"ContainerStarted","Data":"457b39f3bffd54ff30c6ddad3419212b7ef2b99b738869794cf8720375a3c2a5"} Dec 05 02:38:17 crc kubenswrapper[4759]: I1205 02:38:17.527524 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" podStartSLOduration=1.5275020019999999 podStartE2EDuration="1.527502002s" podCreationTimestamp="2025-12-05 02:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 02:38:17.527221245 +0000 UTC m=+8116.742882195" watchObservedRunningTime="2025-12-05 02:38:17.527502002 +0000 UTC m=+8116.743162952" Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.674977 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r5hwf"] Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.677733 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.690740 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5hwf"] Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.736247 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-catalog-content\") pod \"redhat-marketplace-r5hwf\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.736332 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmll9\" (UniqueName: \"kubernetes.io/projected/4f9a1a20-fb50-4e69-aa17-193225ff1437-kube-api-access-zmll9\") pod \"redhat-marketplace-r5hwf\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.736519 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-utilities\") pod \"redhat-marketplace-r5hwf\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.838926 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-utilities\") pod \"redhat-marketplace-r5hwf\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.839068 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-catalog-content\") pod \"redhat-marketplace-r5hwf\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.839107 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmll9\" (UniqueName: \"kubernetes.io/projected/4f9a1a20-fb50-4e69-aa17-193225ff1437-kube-api-access-zmll9\") pod \"redhat-marketplace-r5hwf\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.839537 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-utilities\") pod \"redhat-marketplace-r5hwf\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.839624 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-catalog-content\") pod \"redhat-marketplace-r5hwf\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:38:54 crc kubenswrapper[4759]: I1205 02:38:54.867568 4759 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zmll9\" (UniqueName: \"kubernetes.io/projected/4f9a1a20-fb50-4e69-aa17-193225ff1437-kube-api-access-zmll9\") pod \"redhat-marketplace-r5hwf\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:38:55 crc kubenswrapper[4759]: I1205 02:38:55.008572 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:38:56 crc kubenswrapper[4759]: I1205 02:38:56.139147 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5hwf"] Dec 05 02:38:56 crc kubenswrapper[4759]: I1205 02:38:56.944744 4759 generic.go:334] "Generic (PLEG): container finished" podID="4f9a1a20-fb50-4e69-aa17-193225ff1437" containerID="54c87e06007496605406bbc2ae227fbd0879d52f82b766cd75e4bd4dea06aab1" exitCode=0 Dec 05 02:38:56 crc kubenswrapper[4759]: I1205 02:38:56.945141 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5hwf" event={"ID":"4f9a1a20-fb50-4e69-aa17-193225ff1437","Type":"ContainerDied","Data":"54c87e06007496605406bbc2ae227fbd0879d52f82b766cd75e4bd4dea06aab1"} Dec 05 02:38:56 crc kubenswrapper[4759]: I1205 02:38:56.945166 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5hwf" event={"ID":"4f9a1a20-fb50-4e69-aa17-193225ff1437","Type":"ContainerStarted","Data":"dbd51ca4aaab484f1a5342b2cbbd718a020543439f081ada21c3cb2d00639003"} Dec 05 02:38:56 crc kubenswrapper[4759]: I1205 02:38:56.949219 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 02:38:57 crc kubenswrapper[4759]: I1205 02:38:57.958439 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5hwf" event={"ID":"4f9a1a20-fb50-4e69-aa17-193225ff1437","Type":"ContainerStarted","Data":"71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e"} Dec 05 02:38:58 crc kubenswrapper[4759]: I1205 02:38:58.976954 4759 generic.go:334] "Generic (PLEG): container finished" podID="4f9a1a20-fb50-4e69-aa17-193225ff1437" containerID="71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e" exitCode=0 Dec 05 02:38:58 crc kubenswrapper[4759]: I1205 02:38:58.977247 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5hwf" event={"ID":"4f9a1a20-fb50-4e69-aa17-193225ff1437","Type":"ContainerDied","Data":"71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e"} Dec 05 02:39:00 crc kubenswrapper[4759]: I1205 02:39:00.090528 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5hwf" event={"ID":"4f9a1a20-fb50-4e69-aa17-193225ff1437","Type":"ContainerStarted","Data":"55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6"} Dec 05 02:39:00 crc kubenswrapper[4759]: I1205 02:39:00.136638 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r5hwf" podStartSLOduration=3.715575317 podStartE2EDuration="6.136618132s" podCreationTimestamp="2025-12-05 02:38:54 +0000 UTC" firstStartedPulling="2025-12-05 02:38:56.947941584 +0000 UTC m=+8156.163602524" lastFinishedPulling="2025-12-05 02:38:59.368984389 +0000 UTC m=+8158.584645339" observedRunningTime="2025-12-05 02:39:00.131685323 +0000 UTC m=+8159.347346273" watchObservedRunningTime="2025-12-05 02:39:00.136618132 +0000 UTC 
m=+8159.352279082" Dec 05 02:39:05 crc kubenswrapper[4759]: I1205 02:39:05.009437 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:39:05 crc kubenswrapper[4759]: I1205 02:39:05.010070 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:39:05 crc kubenswrapper[4759]: I1205 02:39:05.062251 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:39:05 crc kubenswrapper[4759]: I1205 02:39:05.207554 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:39:05 crc kubenswrapper[4759]: I1205 02:39:05.304146 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5hwf"] Dec 05 02:39:07 crc kubenswrapper[4759]: I1205 02:39:07.169968 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r5hwf" podUID="4f9a1a20-fb50-4e69-aa17-193225ff1437" containerName="registry-server" containerID="cri-o://55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6" gracePeriod=2 Dec 05 02:39:07 crc kubenswrapper[4759]: I1205 02:39:07.782839 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:39:07 crc kubenswrapper[4759]: I1205 02:39:07.944905 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmll9\" (UniqueName: \"kubernetes.io/projected/4f9a1a20-fb50-4e69-aa17-193225ff1437-kube-api-access-zmll9\") pod \"4f9a1a20-fb50-4e69-aa17-193225ff1437\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " Dec 05 02:39:07 crc kubenswrapper[4759]: I1205 02:39:07.945366 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-utilities\") pod \"4f9a1a20-fb50-4e69-aa17-193225ff1437\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " Dec 05 02:39:07 crc kubenswrapper[4759]: I1205 02:39:07.945489 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-catalog-content\") pod \"4f9a1a20-fb50-4e69-aa17-193225ff1437\" (UID: \"4f9a1a20-fb50-4e69-aa17-193225ff1437\") " Dec 05 02:39:07 crc kubenswrapper[4759]: I1205 02:39:07.946881 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-utilities" (OuterVolumeSpecName: "utilities") pod "4f9a1a20-fb50-4e69-aa17-193225ff1437" (UID: "4f9a1a20-fb50-4e69-aa17-193225ff1437"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:39:07 crc kubenswrapper[4759]: I1205 02:39:07.953288 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f9a1a20-fb50-4e69-aa17-193225ff1437-kube-api-access-zmll9" (OuterVolumeSpecName: "kube-api-access-zmll9") pod "4f9a1a20-fb50-4e69-aa17-193225ff1437" (UID: "4f9a1a20-fb50-4e69-aa17-193225ff1437"). InnerVolumeSpecName "kube-api-access-zmll9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:39:07 crc kubenswrapper[4759]: I1205 02:39:07.967729 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f9a1a20-fb50-4e69-aa17-193225ff1437" (UID: "4f9a1a20-fb50-4e69-aa17-193225ff1437"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.048585 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmll9\" (UniqueName: \"kubernetes.io/projected/4f9a1a20-fb50-4e69-aa17-193225ff1437-kube-api-access-zmll9\") on node \"crc\" DevicePath \"\"" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.048633 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.048647 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f9a1a20-fb50-4e69-aa17-193225ff1437-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.183104 4759 generic.go:334] "Generic (PLEG): container finished" podID="4f9a1a20-fb50-4e69-aa17-193225ff1437" containerID="55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6" exitCode=0 Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.183155 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5hwf" event={"ID":"4f9a1a20-fb50-4e69-aa17-193225ff1437","Type":"ContainerDied","Data":"55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6"} Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.183209 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5hwf" event={"ID":"4f9a1a20-fb50-4e69-aa17-193225ff1437","Type":"ContainerDied","Data":"dbd51ca4aaab484f1a5342b2cbbd718a020543439f081ada21c3cb2d00639003"} Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.183230 4759 scope.go:117] "RemoveContainer" containerID="55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.183256 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5hwf" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.213830 4759 scope.go:117] "RemoveContainer" containerID="71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.233362 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5hwf"] Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.242968 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5hwf"] Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.268999 4759 scope.go:117] "RemoveContainer" containerID="54c87e06007496605406bbc2ae227fbd0879d52f82b766cd75e4bd4dea06aab1" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.325276 4759 scope.go:117] "RemoveContainer" containerID="55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6" Dec 05 02:39:08 crc kubenswrapper[4759]: E1205 02:39:08.325796 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6\": container with ID starting with 55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6 not found: ID does not exist" containerID="55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.325840 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6"} err="failed to get container status \"55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6\": rpc error: code = NotFound desc = could not find container \"55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6\": container with ID starting with 55d74686acf60843d97a5f91ea89bd449c9cbbe03938b4891e99e34d7e5952b6 not found: ID does not exist" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.325867 4759 scope.go:117] "RemoveContainer" containerID="71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e" Dec 05 02:39:08 crc kubenswrapper[4759]: E1205 02:39:08.326213 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e\": container with ID starting with 71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e not found: ID does not exist" containerID="71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.326252 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e"} err="failed to get container status \"71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e\": rpc error: code = NotFound desc = could not find container \"71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e\": container with ID starting with 71e4c6405fab470b8673fb4f24d3d33005318b7f6074b1523d49d59baa4b728e not found: ID does not exist" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.326280 4759 scope.go:117] "RemoveContainer" containerID="54c87e06007496605406bbc2ae227fbd0879d52f82b766cd75e4bd4dea06aab1" Dec 05 02:39:08 crc kubenswrapper[4759]: E1205 02:39:08.326601 4759 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"54c87e06007496605406bbc2ae227fbd0879d52f82b766cd75e4bd4dea06aab1\": container with ID starting with 54c87e06007496605406bbc2ae227fbd0879d52f82b766cd75e4bd4dea06aab1 not found: ID does not exist" containerID="54c87e06007496605406bbc2ae227fbd0879d52f82b766cd75e4bd4dea06aab1" Dec 05 02:39:08 crc kubenswrapper[4759]: I1205 02:39:08.326628 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c87e06007496605406bbc2ae227fbd0879d52f82b766cd75e4bd4dea06aab1"} err="failed to get container status \"54c87e06007496605406bbc2ae227fbd0879d52f82b766cd75e4bd4dea06aab1\": rpc error: code = NotFound desc = could not find container \"54c87e06007496605406bbc2ae227fbd0879d52f82b766cd75e4bd4dea06aab1\": container with ID starting with 54c87e06007496605406bbc2ae227fbd0879d52f82b766cd75e4bd4dea06aab1 not found: ID does not exist" Dec 05 02:39:09 crc kubenswrapper[4759]: I1205 02:39:09.177139 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f9a1a20-fb50-4e69-aa17-193225ff1437" path="/var/lib/kubelet/pods/4f9a1a20-fb50-4e69-aa17-193225ff1437/volumes" Dec 05 02:39:11 crc kubenswrapper[4759]: I1205 02:39:11.219709 4759 generic.go:334] "Generic (PLEG): container finished" podID="b2cd4e4d-321d-4019-961d-2ce6865c47c4" containerID="bc862a70c98bd690b932c56f7267315f89f05a97a454eb6cce4cab16b7b19a59" exitCode=0 Dec 05 02:39:11 crc kubenswrapper[4759]: I1205 02:39:11.219836 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" event={"ID":"b2cd4e4d-321d-4019-961d-2ce6865c47c4","Type":"ContainerDied","Data":"bc862a70c98bd690b932c56f7267315f89f05a97a454eb6cce4cab16b7b19a59"} Dec 05 02:39:12 crc kubenswrapper[4759]: I1205 02:39:12.378745 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" Dec 05 02:39:12 crc kubenswrapper[4759]: I1205 02:39:12.446369 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcnj2\" (UniqueName: \"kubernetes.io/projected/b2cd4e4d-321d-4019-961d-2ce6865c47c4-kube-api-access-gcnj2\") pod \"b2cd4e4d-321d-4019-961d-2ce6865c47c4\" (UID: \"b2cd4e4d-321d-4019-961d-2ce6865c47c4\") " Dec 05 02:39:12 crc kubenswrapper[4759]: I1205 02:39:12.446756 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k9gnm/crc-debug-jd2nm"] Dec 05 02:39:12 crc kubenswrapper[4759]: I1205 02:39:12.447004 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2cd4e4d-321d-4019-961d-2ce6865c47c4-host\") pod \"b2cd4e4d-321d-4019-961d-2ce6865c47c4\" (UID: \"b2cd4e4d-321d-4019-961d-2ce6865c47c4\") " Dec 05 02:39:12 crc kubenswrapper[4759]: I1205 02:39:12.447079 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2cd4e4d-321d-4019-961d-2ce6865c47c4-host" (OuterVolumeSpecName: "host") pod "b2cd4e4d-321d-4019-961d-2ce6865c47c4" (UID: "b2cd4e4d-321d-4019-961d-2ce6865c47c4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 02:39:12 crc kubenswrapper[4759]: I1205 02:39:12.448200 4759 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2cd4e4d-321d-4019-961d-2ce6865c47c4-host\") on node \"crc\" DevicePath \"\"" Dec 05 02:39:12 crc kubenswrapper[4759]: I1205 02:39:12.452331 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2cd4e4d-321d-4019-961d-2ce6865c47c4-kube-api-access-gcnj2" (OuterVolumeSpecName: "kube-api-access-gcnj2") pod "b2cd4e4d-321d-4019-961d-2ce6865c47c4" (UID: "b2cd4e4d-321d-4019-961d-2ce6865c47c4"). InnerVolumeSpecName "kube-api-access-gcnj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:39:12 crc kubenswrapper[4759]: I1205 02:39:12.463633 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k9gnm/crc-debug-jd2nm"] Dec 05 02:39:12 crc kubenswrapper[4759]: I1205 02:39:12.551000 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcnj2\" (UniqueName: \"kubernetes.io/projected/b2cd4e4d-321d-4019-961d-2ce6865c47c4-kube-api-access-gcnj2\") on node \"crc\" DevicePath \"\"" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.169508 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2cd4e4d-321d-4019-961d-2ce6865c47c4" path="/var/lib/kubelet/pods/b2cd4e4d-321d-4019-961d-2ce6865c47c4/volumes" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.254534 4759 scope.go:117] "RemoveContainer" containerID="bc862a70c98bd690b932c56f7267315f89f05a97a454eb6cce4cab16b7b19a59" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.254700 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k9gnm/crc-debug-jd2nm" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.678269 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k9gnm/crc-debug-6kw9x"] Dec 05 02:39:13 crc kubenswrapper[4759]: E1205 02:39:13.678839 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9a1a20-fb50-4e69-aa17-193225ff1437" containerName="extract-content" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.678856 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9a1a20-fb50-4e69-aa17-193225ff1437" containerName="extract-content" Dec 05 02:39:13 crc kubenswrapper[4759]: E1205 02:39:13.678884 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9a1a20-fb50-4e69-aa17-193225ff1437" containerName="registry-server" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.678892 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9a1a20-fb50-4e69-aa17-193225ff1437" containerName="registry-server" Dec 05 02:39:13 crc kubenswrapper[4759]: E1205 02:39:13.678942 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9a1a20-fb50-4e69-aa17-193225ff1437" containerName="extract-utilities" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.678953 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9a1a20-fb50-4e69-aa17-193225ff1437" containerName="extract-utilities" Dec 05 02:39:13 crc kubenswrapper[4759]: E1205 02:39:13.678971 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2cd4e4d-321d-4019-961d-2ce6865c47c4" containerName="container-00" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.678980 4759 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b2cd4e4d-321d-4019-961d-2ce6865c47c4" containerName="container-00" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.679238 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f9a1a20-fb50-4e69-aa17-193225ff1437" containerName="registry-server" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.679257 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2cd4e4d-321d-4019-961d-2ce6865c47c4" containerName="container-00" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.680161 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.683713 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-k9gnm"/"default-dockercfg-65qwf" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.778630 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c758b676-2122-4dc0-a270-a7917c0abcbc-host\") pod \"crc-debug-6kw9x\" (UID: \"c758b676-2122-4dc0-a270-a7917c0abcbc\") " pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.778769 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmblc\" (UniqueName: \"kubernetes.io/projected/c758b676-2122-4dc0-a270-a7917c0abcbc-kube-api-access-wmblc\") pod \"crc-debug-6kw9x\" (UID: \"c758b676-2122-4dc0-a270-a7917c0abcbc\") " pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.881120 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmblc\" (UniqueName: \"kubernetes.io/projected/c758b676-2122-4dc0-a270-a7917c0abcbc-kube-api-access-wmblc\") pod \"crc-debug-6kw9x\" (UID: \"c758b676-2122-4dc0-a270-a7917c0abcbc\") " pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.881340 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c758b676-2122-4dc0-a270-a7917c0abcbc-host\") pod \"crc-debug-6kw9x\" (UID: \"c758b676-2122-4dc0-a270-a7917c0abcbc\") " pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.881490 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c758b676-2122-4dc0-a270-a7917c0abcbc-host\") pod \"crc-debug-6kw9x\" (UID: \"c758b676-2122-4dc0-a270-a7917c0abcbc\") " pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" Dec 05 02:39:13 crc kubenswrapper[4759]: I1205 02:39:13.915735 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmblc\" (UniqueName: \"kubernetes.io/projected/c758b676-2122-4dc0-a270-a7917c0abcbc-kube-api-access-wmblc\") pod \"crc-debug-6kw9x\" (UID: \"c758b676-2122-4dc0-a270-a7917c0abcbc\") " pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" Dec 05 02:39:14 crc kubenswrapper[4759]: I1205 02:39:14.005314 4759 util.go:30] "No sandbox for pod can be found. 
Dec 05 02:39:14 crc kubenswrapper[4759]: I1205 02:39:14.264253 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" event={"ID":"c758b676-2122-4dc0-a270-a7917c0abcbc","Type":"ContainerStarted","Data":"938df8f87fe52084b80cc4eb773788f825e9bd3e4cc76bf80dc1871ba589e139"} Dec 05 02:39:15 crc kubenswrapper[4759]: I1205 02:39:15.276612 4759 generic.go:334] "Generic (PLEG): container finished" podID="c758b676-2122-4dc0-a270-a7917c0abcbc" containerID="2ac2b28438c6cb82fd9f970f551981dd6199dc36de811ca0cb4e2489eae61285" exitCode=0 Dec 05 02:39:15 crc kubenswrapper[4759]: I1205 02:39:15.276655 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" event={"ID":"c758b676-2122-4dc0-a270-a7917c0abcbc","Type":"ContainerDied","Data":"2ac2b28438c6cb82fd9f970f551981dd6199dc36de811ca0cb4e2489eae61285"} Dec 05 02:39:16 crc kubenswrapper[4759]: I1205 02:39:16.419976 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" Dec 05 02:39:16 crc kubenswrapper[4759]: I1205 02:39:16.537457 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmblc\" (UniqueName: \"kubernetes.io/projected/c758b676-2122-4dc0-a270-a7917c0abcbc-kube-api-access-wmblc\") pod \"c758b676-2122-4dc0-a270-a7917c0abcbc\" (UID: \"c758b676-2122-4dc0-a270-a7917c0abcbc\") " Dec 05 02:39:16 crc kubenswrapper[4759]: I1205 02:39:16.537625 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c758b676-2122-4dc0-a270-a7917c0abcbc-host\") pod \"c758b676-2122-4dc0-a270-a7917c0abcbc\" (UID: \"c758b676-2122-4dc0-a270-a7917c0abcbc\") " Dec 05 02:39:16 crc kubenswrapper[4759]: I1205 02:39:16.537811 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c758b676-2122-4dc0-a270-a7917c0abcbc-host" (OuterVolumeSpecName: "host") pod "c758b676-2122-4dc0-a270-a7917c0abcbc" (UID: "c758b676-2122-4dc0-a270-a7917c0abcbc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 02:39:16 crc kubenswrapper[4759]: I1205 02:39:16.538361 4759 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c758b676-2122-4dc0-a270-a7917c0abcbc-host\") on node \"crc\" DevicePath \"\"" Dec 05 02:39:16 crc kubenswrapper[4759]: I1205 02:39:16.543797 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c758b676-2122-4dc0-a270-a7917c0abcbc-kube-api-access-wmblc" (OuterVolumeSpecName: "kube-api-access-wmblc") pod "c758b676-2122-4dc0-a270-a7917c0abcbc" (UID: "c758b676-2122-4dc0-a270-a7917c0abcbc"). InnerVolumeSpecName "kube-api-access-wmblc".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:39:16 crc kubenswrapper[4759]: I1205 02:39:16.639870 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmblc\" (UniqueName: \"kubernetes.io/projected/c758b676-2122-4dc0-a270-a7917c0abcbc-kube-api-access-wmblc\") on node \"crc\" DevicePath \"\"" Dec 05 02:39:17 crc kubenswrapper[4759]: I1205 02:39:17.304612 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" event={"ID":"c758b676-2122-4dc0-a270-a7917c0abcbc","Type":"ContainerDied","Data":"938df8f87fe52084b80cc4eb773788f825e9bd3e4cc76bf80dc1871ba589e139"} Dec 05 02:39:17 crc kubenswrapper[4759]: I1205 02:39:17.304656 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k9gnm/crc-debug-6kw9x" Dec 05 02:39:17 crc kubenswrapper[4759]: I1205 02:39:17.305010 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="938df8f87fe52084b80cc4eb773788f825e9bd3e4cc76bf80dc1871ba589e139" Dec 05 02:39:17 crc kubenswrapper[4759]: I1205 02:39:17.430062 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k9gnm/crc-debug-6kw9x"] Dec 05 02:39:17 crc kubenswrapper[4759]: I1205 02:39:17.442437 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k9gnm/crc-debug-6kw9x"] Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.617413 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k9gnm/crc-debug-pdsnc"] Dec 05 02:39:18 crc kubenswrapper[4759]: E1205 02:39:18.618096 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c758b676-2122-4dc0-a270-a7917c0abcbc" containerName="container-00" Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.618110 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="c758b676-2122-4dc0-a270-a7917c0abcbc" containerName="container-00" Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.618388 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="c758b676-2122-4dc0-a270-a7917c0abcbc" containerName="container-00" Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.619500 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.623167 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-k9gnm"/"default-dockercfg-65qwf" Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.683496 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq5jq\" (UniqueName: \"kubernetes.io/projected/1b895916-0cc6-4ef0-bc0b-8806cfec023a-kube-api-access-pq5jq\") pod \"crc-debug-pdsnc\" (UID: \"1b895916-0cc6-4ef0-bc0b-8806cfec023a\") " pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.683727 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b895916-0cc6-4ef0-bc0b-8806cfec023a-host\") pod \"crc-debug-pdsnc\" (UID: \"1b895916-0cc6-4ef0-bc0b-8806cfec023a\") " pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.786209 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq5jq\" (UniqueName: \"kubernetes.io/projected/1b895916-0cc6-4ef0-bc0b-8806cfec023a-kube-api-access-pq5jq\") pod \"crc-debug-pdsnc\" (UID: \"1b895916-0cc6-4ef0-bc0b-8806cfec023a\") " pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.786398 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b895916-0cc6-4ef0-bc0b-8806cfec023a-host\") pod \"crc-debug-pdsnc\" (UID: \"1b895916-0cc6-4ef0-bc0b-8806cfec023a\") " pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.786576 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b895916-0cc6-4ef0-bc0b-8806cfec023a-host\") pod \"crc-debug-pdsnc\" (UID: \"1b895916-0cc6-4ef0-bc0b-8806cfec023a\") " pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.807293 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq5jq\" (UniqueName: \"kubernetes.io/projected/1b895916-0cc6-4ef0-bc0b-8806cfec023a-kube-api-access-pq5jq\") pod \"crc-debug-pdsnc\" (UID: \"1b895916-0cc6-4ef0-bc0b-8806cfec023a\") " pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" Dec 05 02:39:18 crc kubenswrapper[4759]: I1205 02:39:18.939175 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" Dec 05 02:39:19 crc kubenswrapper[4759]: I1205 02:39:19.176145 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c758b676-2122-4dc0-a270-a7917c0abcbc" path="/var/lib/kubelet/pods/c758b676-2122-4dc0-a270-a7917c0abcbc/volumes" Dec 05 02:39:19 crc kubenswrapper[4759]: I1205 02:39:19.325895 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" event={"ID":"1b895916-0cc6-4ef0-bc0b-8806cfec023a","Type":"ContainerStarted","Data":"c1cd0333bee82441583afc418910ec5feec7be556fadc4399541f2d995b907b5"} Dec 05 02:39:20 crc kubenswrapper[4759]: I1205 02:39:20.337501 4759 generic.go:334] "Generic (PLEG): container finished" podID="1b895916-0cc6-4ef0-bc0b-8806cfec023a" containerID="fb1a10ba6f74a74a17a643f56ac06b69f0b0469556d13a77a05ff79b964efa8d" exitCode=0 Dec 05 02:39:20 crc kubenswrapper[4759]: I1205 02:39:20.337601 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" event={"ID":"1b895916-0cc6-4ef0-bc0b-8806cfec023a","Type":"ContainerDied","Data":"fb1a10ba6f74a74a17a643f56ac06b69f0b0469556d13a77a05ff79b964efa8d"} Dec 05 02:39:20 crc kubenswrapper[4759]: I1205 02:39:20.385354 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k9gnm/crc-debug-pdsnc"] Dec 05 02:39:20 crc kubenswrapper[4759]: I1205 02:39:20.397936 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k9gnm/crc-debug-pdsnc"] Dec 05 02:39:21 crc kubenswrapper[4759]: I1205 02:39:21.488169 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" Dec 05 02:39:21 crc kubenswrapper[4759]: I1205 02:39:21.577703 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b895916-0cc6-4ef0-bc0b-8806cfec023a-host\") pod \"1b895916-0cc6-4ef0-bc0b-8806cfec023a\" (UID: \"1b895916-0cc6-4ef0-bc0b-8806cfec023a\") " Dec 05 02:39:21 crc kubenswrapper[4759]: I1205 02:39:21.577843 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b895916-0cc6-4ef0-bc0b-8806cfec023a-host" (OuterVolumeSpecName: "host") pod "1b895916-0cc6-4ef0-bc0b-8806cfec023a" (UID: "1b895916-0cc6-4ef0-bc0b-8806cfec023a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 02:39:21 crc kubenswrapper[4759]: I1205 02:39:21.578297 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq5jq\" (UniqueName: \"kubernetes.io/projected/1b895916-0cc6-4ef0-bc0b-8806cfec023a-kube-api-access-pq5jq\") pod \"1b895916-0cc6-4ef0-bc0b-8806cfec023a\" (UID: \"1b895916-0cc6-4ef0-bc0b-8806cfec023a\") " Dec 05 02:39:21 crc kubenswrapper[4759]: I1205 02:39:21.578803 4759 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b895916-0cc6-4ef0-bc0b-8806cfec023a-host\") on node \"crc\" DevicePath \"\"" Dec 05 02:39:21 crc kubenswrapper[4759]: I1205 02:39:21.598086 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b895916-0cc6-4ef0-bc0b-8806cfec023a-kube-api-access-pq5jq" (OuterVolumeSpecName: "kube-api-access-pq5jq") pod "1b895916-0cc6-4ef0-bc0b-8806cfec023a" (UID: "1b895916-0cc6-4ef0-bc0b-8806cfec023a"). InnerVolumeSpecName "kube-api-access-pq5jq". PluginName "kubernetes.io/projected", VolumeGidValue ""
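
Each short-lived crc-debug pod in this namespace traces the same arc: SyncLoop ADD, PLEG ContainerStarted then ContainerDied, SyncLoop DELETE, volume teardown, SyncLoop REMOVE, and finally orphaned-volume cleanup. A sketch that condenses a kubelet journal into one such lifecycle line per pod; the SyncLoop wording is taken from the entries above, and single-pod event lists (as printed here) are assumed:

```python
import re
import sys

# Reduces the kubelet journal to one lifecycle line per pod, using the
# SyncLoop wording seen above; assumes single-pod event lists as printed here.
EVENT = re.compile(r'"SyncLoop (ADD|DELETE|REMOVE)" source="api" pods=\["([^"]+)"\]')

timeline = {}
for line in sys.stdin:
    for op, pod in EVENT.findall(line):
        timeline.setdefault(pod, []).append(op)

for pod, ops in timeline.items():
    print(f"{pod}: {' -> '.join(ops)}")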
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:39:21 crc kubenswrapper[4759]: I1205 02:39:21.681535 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq5jq\" (UniqueName: \"kubernetes.io/projected/1b895916-0cc6-4ef0-bc0b-8806cfec023a-kube-api-access-pq5jq\") on node \"crc\" DevicePath \"\"" Dec 05 02:39:22 crc kubenswrapper[4759]: I1205 02:39:22.359153 4759 scope.go:117] "RemoveContainer" containerID="fb1a10ba6f74a74a17a643f56ac06b69f0b0469556d13a77a05ff79b964efa8d" Dec 05 02:39:22 crc kubenswrapper[4759]: I1205 02:39:22.359322 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k9gnm/crc-debug-pdsnc" Dec 05 02:39:23 crc kubenswrapper[4759]: I1205 02:39:23.177087 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b895916-0cc6-4ef0-bc0b-8806cfec023a" path="/var/lib/kubelet/pods/1b895916-0cc6-4ef0-bc0b-8806cfec023a/volumes" Dec 05 02:39:34 crc kubenswrapper[4759]: I1205 02:39:34.434104 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:39:34 crc kubenswrapper[4759]: I1205 02:39:34.435516 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:40:04 crc kubenswrapper[4759]: I1205 02:40:04.433395 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:40:04 crc kubenswrapper[4759]: I1205 02:40:04.433844 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:40:17 crc kubenswrapper[4759]: I1205 02:40:17.239236 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_dab6929f-0e72-4f06-84cd-c3db7967578f/aodh-api/0.log" Dec 05 02:40:17 crc kubenswrapper[4759]: I1205 02:40:17.335394 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_dab6929f-0e72-4f06-84cd-c3db7967578f/aodh-evaluator/0.log" Dec 05 02:40:17 crc kubenswrapper[4759]: I1205 02:40:17.385059 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_dab6929f-0e72-4f06-84cd-c3db7967578f/aodh-listener/0.log" Dec 05 02:40:17 crc kubenswrapper[4759]: I1205 02:40:17.455754 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_dab6929f-0e72-4f06-84cd-c3db7967578f/aodh-notifier/0.log" Dec 05 02:40:17 crc kubenswrapper[4759]: I1205 02:40:17.613033 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-bb856b8d-7psj7_52bf4fd7-6aa6-4bdf-b8ac-60c071d42455/barbican-api/0.log" Dec 05 02:40:17 crc kubenswrapper[4759]: I1205 02:40:17.633967 4759 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-bb856b8d-7psj7_52bf4fd7-6aa6-4bdf-b8ac-60c071d42455/barbican-api-log/0.log" Dec 05 02:40:17 crc kubenswrapper[4759]: I1205 02:40:17.748915 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d8dd9dc58-8n9k2_0e7388a6-d295-4807-8ce7-1eeb7dc55707/barbican-keystone-listener/0.log" Dec 05 02:40:17 crc kubenswrapper[4759]: I1205 02:40:17.945520 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d8dd9dc58-8n9k2_0e7388a6-d295-4807-8ce7-1eeb7dc55707/barbican-keystone-listener-log/0.log" Dec 05 02:40:17 crc kubenswrapper[4759]: I1205 02:40:17.978312 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-77bd8fcb75-d6pnc_9502dfee-cb5d-44de-a549-4f0060d29d9b/barbican-worker/0.log" Dec 05 02:40:18 crc kubenswrapper[4759]: I1205 02:40:18.018828 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-77bd8fcb75-d6pnc_9502dfee-cb5d-44de-a549-4f0060d29d9b/barbican-worker-log/0.log" Dec 05 02:40:18 crc kubenswrapper[4759]: I1205 02:40:18.452285 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-npsmp_9542247a-5527-4bd2-bc5d-8bd30be01c1d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:18 crc kubenswrapper[4759]: I1205 02:40:18.466771 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_507bfd65-c768-4dfb-9e1c-aed7bdf0ef55/ceilometer-central-agent/0.log" Dec 05 02:40:18 crc kubenswrapper[4759]: I1205 02:40:18.676660 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_507bfd65-c768-4dfb-9e1c-aed7bdf0ef55/ceilometer-notification-agent/0.log" Dec 05 02:40:18 crc kubenswrapper[4759]: I1205 02:40:18.696001 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_507bfd65-c768-4dfb-9e1c-aed7bdf0ef55/proxy-httpd/0.log" Dec 05 02:40:18 crc kubenswrapper[4759]: I1205 02:40:18.723232 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_507bfd65-c768-4dfb-9e1c-aed7bdf0ef55/sg-core/0.log" Dec 05 02:40:18 crc kubenswrapper[4759]: I1205 02:40:18.904178 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-d8mzd_76e169bb-796f-43c4-a487-36cf0c3d13a0/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:18 crc kubenswrapper[4759]: I1205 02:40:18.953874 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gxlh2_8d7edc63-8bf1-4356-bc8a-c719049e0cee/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:19 crc kubenswrapper[4759]: I1205 02:40:19.245983 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8ba6782e-a35c-4c30-ae5f-5efb85cc001c/cinder-api-log/0.log" Dec 05 02:40:19 crc kubenswrapper[4759]: I1205 02:40:19.292602 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8ba6782e-a35c-4c30-ae5f-5efb85cc001c/cinder-api/0.log" Dec 05 02:40:19 crc kubenswrapper[4759]: I1205 02:40:19.516568 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_3772ce5b-f22d-4f9a-ad46-66923fae82be/cinder-backup/0.log" Dec 05 02:40:19 crc kubenswrapper[4759]: I1205 02:40:19.541279 4759 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_cinder-backup-0_3772ce5b-f22d-4f9a-ad46-66923fae82be/probe/0.log" Dec 05 02:40:19 crc kubenswrapper[4759]: I1205 02:40:19.600154 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_84365c40-0d24-43ab-b5d1-66c9531bb860/cinder-scheduler/0.log" Dec 05 02:40:19 crc kubenswrapper[4759]: I1205 02:40:19.762686 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_84365c40-0d24-43ab-b5d1-66c9531bb860/probe/0.log" Dec 05 02:40:19 crc kubenswrapper[4759]: I1205 02:40:19.880736 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_dbf1c346-6958-4849-8773-9d7b42b2c6fd/cinder-volume/0.log" Dec 05 02:40:19 crc kubenswrapper[4759]: I1205 02:40:19.883188 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_dbf1c346-6958-4849-8773-9d7b42b2c6fd/probe/0.log" Dec 05 02:40:20 crc kubenswrapper[4759]: I1205 02:40:20.039803 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xlpdr_fef9f327-ed85-42f2-a400-624e7c84374b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:20 crc kubenswrapper[4759]: I1205 02:40:20.203933 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-p9hv7_3da88e6e-b264-4624-a173-5dd09edf5066/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:20 crc kubenswrapper[4759]: I1205 02:40:20.316189 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c6cf8d999-rmpvx_afdea169-3f66-4ad6-be4e-755db23f6a50/init/0.log" Dec 05 02:40:20 crc kubenswrapper[4759]: I1205 02:40:20.520579 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c6cf8d999-rmpvx_afdea169-3f66-4ad6-be4e-755db23f6a50/init/0.log" Dec 05 02:40:20 crc kubenswrapper[4759]: I1205 02:40:20.576560 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e29f7592-d4e5-4a46-bcdd-b52666d8e689/glance-httpd/0.log" Dec 05 02:40:20 crc kubenswrapper[4759]: I1205 02:40:20.605920 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c6cf8d999-rmpvx_afdea169-3f66-4ad6-be4e-755db23f6a50/dnsmasq-dns/0.log" Dec 05 02:40:20 crc kubenswrapper[4759]: I1205 02:40:20.607415 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e29f7592-d4e5-4a46-bcdd-b52666d8e689/glance-log/0.log" Dec 05 02:40:20 crc kubenswrapper[4759]: I1205 02:40:20.750261 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c44ace8b-2e47-4682-bcea-3626f840d31b/glance-httpd/0.log" Dec 05 02:40:20 crc kubenswrapper[4759]: I1205 02:40:20.766700 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c44ace8b-2e47-4682-bcea-3626f840d31b/glance-log/0.log" Dec 05 02:40:21 crc kubenswrapper[4759]: I1205 02:40:21.223133 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-85bff75774-kcbvj_75ee8344-adba-4c6d-83a2-52e1e8ce15e7/heat-engine/0.log" Dec 05 02:40:21 crc kubenswrapper[4759]: I1205 02:40:21.525800 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dc499994d-vp27z_81556b05-cd4e-407a-830f-e7e38962d519/horizon/0.log" Dec 05 02:40:21 crc 
kubenswrapper[4759]: I1205 02:40:21.706250 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vxz7k_361afa29-23e5-45e4-8c9d-c7da34c4b1ac/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:21 crc kubenswrapper[4759]: I1205 02:40:21.763075 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7666bd695c-zdxtn_93df44aa-16c3-4374-a75a-440bc03ba2cd/heat-api/0.log" Dec 05 02:40:22 crc kubenswrapper[4759]: I1205 02:40:22.087960 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5fd8599d54-rjclc_c6dff63f-0a5b-4f52-a373-39a85e77df1e/heat-cfnapi/0.log" Dec 05 02:40:22 crc kubenswrapper[4759]: I1205 02:40:22.129065 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dc499994d-vp27z_81556b05-cd4e-407a-830f-e7e38962d519/horizon-log/0.log" Dec 05 02:40:22 crc kubenswrapper[4759]: I1205 02:40:22.202006 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-n44mt_a0d1528b-9aba-49f6-982a-c0dc44cec8a8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:22 crc kubenswrapper[4759]: I1205 02:40:22.363106 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29414941-zc2s9_411ce212-9655-4a4e-8056-adcbaf433178/keystone-cron/0.log" Dec 05 02:40:22 crc kubenswrapper[4759]: I1205 02:40:22.387954 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29415001-qdv8x_f40db81b-0573-4c38-9382-5c23ef6cde76/keystone-cron/0.log" Dec 05 02:40:22 crc kubenswrapper[4759]: I1205 02:40:22.643810 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_43572d36-66d8-45df-9976-33f0b1e313f9/kube-state-metrics/0.log" Dec 05 02:40:22 crc kubenswrapper[4759]: I1205 02:40:22.737222 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-b7c787945-xkhnb_b37f0510-4911-4842-866a-863c4ac7e7c9/keystone-api/0.log" Dec 05 02:40:22 crc kubenswrapper[4759]: I1205 02:40:22.768873 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-c9h4q_b68949f6-0ba5-476a-8ff1-9b4247fe99e8/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:22 crc kubenswrapper[4759]: I1205 02:40:22.885344 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-k248z_3e40e39a-7038-4839-9104-6cf64842c4a7/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:22 crc kubenswrapper[4759]: I1205 02:40:22.971335 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_1acbfd13-8d88-4169-b1f6-098a33b9cc15/manila-api-log/0.log" Dec 05 02:40:23 crc kubenswrapper[4759]: I1205 02:40:23.077984 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_1acbfd13-8d88-4169-b1f6-098a33b9cc15/manila-api/0.log" Dec 05 02:40:23 crc kubenswrapper[4759]: I1205 02:40:23.196855 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_20807b16-d503-447f-84ca-43f49c001c0c/probe/0.log" Dec 05 02:40:23 crc kubenswrapper[4759]: I1205 02:40:23.222670 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_20807b16-d503-447f-84ca-43f49c001c0c/manila-scheduler/0.log" Dec 05 02:40:23 crc kubenswrapper[4759]: I1205 02:40:23.374174 4759 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a8b83f4d-0c22-48a4-b589-109ff6a5e8e2/manila-share/0.log" Dec 05 02:40:23 crc kubenswrapper[4759]: I1205 02:40:23.441809 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a8b83f4d-0c22-48a4-b589-109ff6a5e8e2/probe/0.log" Dec 05 02:40:23 crc kubenswrapper[4759]: I1205 02:40:23.648024 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_ea73c43e-ad87-47bd-8bfa-96d9a2be5cf2/mysqld-exporter/0.log" Dec 05 02:40:23 crc kubenswrapper[4759]: I1205 02:40:23.948029 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7677b6f8d5-zwkn7_b885b03c-f613-4c09-9ec3-8492c335923a/neutron-httpd/0.log" Dec 05 02:40:23 crc kubenswrapper[4759]: I1205 02:40:23.972194 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ts2v2_dad31a51-d010-4c0a-b52f-022acdb7d893/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:24 crc kubenswrapper[4759]: I1205 02:40:24.084120 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7677b6f8d5-zwkn7_b885b03c-f613-4c09-9ec3-8492c335923a/neutron-api/0.log" Dec 05 02:40:24 crc kubenswrapper[4759]: I1205 02:40:24.800338 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5ed8d71f-0481-4f8c-aed6-972efd952e3b/nova-cell0-conductor-conductor/0.log" Dec 05 02:40:24 crc kubenswrapper[4759]: I1205 02:40:24.946128 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a886f187-7e44-44b0-8dd6-030df520def9/nova-api-log/0.log" Dec 05 02:40:25 crc kubenswrapper[4759]: I1205 02:40:25.096687 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8aa5c436-87aa-44d6-b16e-076b4cca0bd5/nova-cell1-conductor-conductor/0.log" Dec 05 02:40:25 crc kubenswrapper[4759]: I1205 02:40:25.371471 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7229698f-fca8-46ac-b297-59fd47d15e13/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 02:40:25 crc kubenswrapper[4759]: I1205 02:40:25.441538 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-7mf4j_3fac4138-c163-4a29-b1b0-b78285e908ec/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:25 crc kubenswrapper[4759]: I1205 02:40:25.693150 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a886f187-7e44-44b0-8dd6-030df520def9/nova-api-api/0.log" Dec 05 02:40:26 crc kubenswrapper[4759]: I1205 02:40:26.114647 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_372a0f97-53ca-477d-9202-5616650e4192/nova-metadata-log/0.log" Dec 05 02:40:26 crc kubenswrapper[4759]: I1205 02:40:26.418361 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f2e8a3ef-84d0-4e49-9f3d-a500ceefc2c9/nova-scheduler-scheduler/0.log" Dec 05 02:40:26 crc kubenswrapper[4759]: I1205 02:40:26.430474 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cb3095df-1b95-485e-99b5-6a3886c58ac3/mysql-bootstrap/0.log" Dec 05 02:40:26 crc kubenswrapper[4759]: I1205 02:40:26.637800 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_cb3095df-1b95-485e-99b5-6a3886c58ac3/mysql-bootstrap/0.log" Dec 05 02:40:26 crc kubenswrapper[4759]: I1205 02:40:26.694390 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cb3095df-1b95-485e-99b5-6a3886c58ac3/galera/0.log" Dec 05 02:40:26 crc kubenswrapper[4759]: I1205 02:40:26.838935 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1013063a-9cdb-47ba-8c7d-5161bbbad9d4/mysql-bootstrap/0.log" Dec 05 02:40:27 crc kubenswrapper[4759]: I1205 02:40:27.037550 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1013063a-9cdb-47ba-8c7d-5161bbbad9d4/galera/0.log" Dec 05 02:40:27 crc kubenswrapper[4759]: I1205 02:40:27.046617 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1013063a-9cdb-47ba-8c7d-5161bbbad9d4/mysql-bootstrap/0.log" Dec 05 02:40:27 crc kubenswrapper[4759]: I1205 02:40:27.215702 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8effa6a4-dc68-4020-bd47-c83bcdc8d337/openstackclient/0.log" Dec 05 02:40:27 crc kubenswrapper[4759]: I1205 02:40:27.358150 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6qct9_0a66884c-5b7a-4462-8e9d-668a97883211/openstack-network-exporter/0.log" Dec 05 02:40:27 crc kubenswrapper[4759]: I1205 02:40:27.538361 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nw9xk_d4b47f07-88f8-4a9a-97ee-7c61be8a6235/ovn-controller/0.log" Dec 05 02:40:27 crc kubenswrapper[4759]: I1205 02:40:27.661685 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nscp4_87eff283-d384-4578-9a23-0d7dab551aab/ovsdb-server-init/0.log" Dec 05 02:40:27 crc kubenswrapper[4759]: I1205 02:40:27.930505 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nscp4_87eff283-d384-4578-9a23-0d7dab551aab/ovsdb-server/0.log" Dec 05 02:40:27 crc kubenswrapper[4759]: I1205 02:40:27.973347 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nscp4_87eff283-d384-4578-9a23-0d7dab551aab/ovsdb-server-init/0.log" Dec 05 02:40:28 crc kubenswrapper[4759]: I1205 02:40:28.008533 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nscp4_87eff283-d384-4578-9a23-0d7dab551aab/ovs-vswitchd/0.log" Dec 05 02:40:28 crc kubenswrapper[4759]: I1205 02:40:28.216803 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-qc2t8_5ec426df-120e-4f92-a1e3-3def5d61f3d3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:28 crc kubenswrapper[4759]: I1205 02:40:28.420765 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e813fe2b-c789-4c1a-89be-65e269dd6d17/openstack-network-exporter/0.log" Dec 05 02:40:28 crc kubenswrapper[4759]: I1205 02:40:28.445679 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e813fe2b-c789-4c1a-89be-65e269dd6d17/ovn-northd/0.log" Dec 05 02:40:28 crc kubenswrapper[4759]: I1205 02:40:28.629852 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a28d6f96-86fb-420c-a292-8c65e0088079/openstack-network-exporter/0.log" Dec 05 02:40:28 crc kubenswrapper[4759]: I1205 02:40:28.656554 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_a28d6f96-86fb-420c-a292-8c65e0088079/ovsdbserver-nb/0.log" Dec 05 02:40:28 crc kubenswrapper[4759]: I1205 02:40:28.877875 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_42baeb94-be38-4927-bb0d-9b37877cf412/openstack-network-exporter/0.log" Dec 05 02:40:28 crc kubenswrapper[4759]: I1205 02:40:28.923663 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_42baeb94-be38-4927-bb0d-9b37877cf412/ovsdbserver-sb/0.log" Dec 05 02:40:29 crc kubenswrapper[4759]: I1205 02:40:29.248205 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_372a0f97-53ca-477d-9202-5616650e4192/nova-metadata-metadata/0.log" Dec 05 02:40:29 crc kubenswrapper[4759]: I1205 02:40:29.276657 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bb6fdf748-gpknz_a2ebc8a7-dfee-4768-a3c2-976932027197/placement-api/0.log" Dec 05 02:40:29 crc kubenswrapper[4759]: I1205 02:40:29.326012 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bb6fdf748-gpknz_a2ebc8a7-dfee-4768-a3c2-976932027197/placement-log/0.log" Dec 05 02:40:29 crc kubenswrapper[4759]: I1205 02:40:29.688187 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_df6b2930-dd32-49ef-a13e-2329eba38827/init-config-reloader/0.log" Dec 05 02:40:29 crc kubenswrapper[4759]: I1205 02:40:29.881993 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_df6b2930-dd32-49ef-a13e-2329eba38827/init-config-reloader/0.log" Dec 05 02:40:29 crc kubenswrapper[4759]: I1205 02:40:29.922514 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_df6b2930-dd32-49ef-a13e-2329eba38827/prometheus/0.log" Dec 05 02:40:29 crc kubenswrapper[4759]: I1205 02:40:29.930194 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_df6b2930-dd32-49ef-a13e-2329eba38827/thanos-sidecar/0.log" Dec 05 02:40:29 crc kubenswrapper[4759]: I1205 02:40:29.985041 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_df6b2930-dd32-49ef-a13e-2329eba38827/config-reloader/0.log" Dec 05 02:40:30 crc kubenswrapper[4759]: I1205 02:40:30.122160 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_975c9850-0dc7-4b43-a521-015930850b0b/setup-container/0.log" Dec 05 02:40:30 crc kubenswrapper[4759]: I1205 02:40:30.373618 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_975c9850-0dc7-4b43-a521-015930850b0b/setup-container/0.log" Dec 05 02:40:30 crc kubenswrapper[4759]: I1205 02:40:30.411414 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_975c9850-0dc7-4b43-a521-015930850b0b/rabbitmq/0.log" Dec 05 02:40:30 crc kubenswrapper[4759]: I1205 02:40:30.463569 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1cf2a4df-221d-4b0c-8a47-114deb1af60a/setup-container/0.log" Dec 05 02:40:30 crc kubenswrapper[4759]: I1205 02:40:30.658980 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1cf2a4df-221d-4b0c-8a47-114deb1af60a/setup-container/0.log" Dec 05 02:40:30 crc kubenswrapper[4759]: I1205 02:40:30.724660 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-qt6qw_81fe58fd-9cda-4705-9755-2b9bb62211f7/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:30 crc kubenswrapper[4759]: I1205 02:40:30.782250 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1cf2a4df-221d-4b0c-8a47-114deb1af60a/rabbitmq/0.log" Dec 05 02:40:31 crc kubenswrapper[4759]: I1205 02:40:31.002536 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-5fn4x_77d4cfb2-ced1-4306-a020-5ea1a3ed597c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:31 crc kubenswrapper[4759]: I1205 02:40:31.017509 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-4n8lq_9b0d860b-da25-46cf-abf7-17755154fc43/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:31 crc kubenswrapper[4759]: I1205 02:40:31.207694 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x9fpm_cf4930f4-da24-4012-b8d1-1bcb0d5b0bef/ssh-known-hosts-edpm-deployment/0.log" Dec 05 02:40:31 crc kubenswrapper[4759]: I1205 02:40:31.506935 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-8b7bf4bd7-qq45k_f625a19c-a9af-401d-a834-37a79e3dfeb4/proxy-server/0.log" Dec 05 02:40:31 crc kubenswrapper[4759]: I1205 02:40:31.553270 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-wt66w_12e24711-58db-434b-97ed-5db25d183784/swift-ring-rebalance/0.log" Dec 05 02:40:31 crc kubenswrapper[4759]: I1205 02:40:31.699389 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-8b7bf4bd7-qq45k_f625a19c-a9af-401d-a834-37a79e3dfeb4/proxy-httpd/0.log" Dec 05 02:40:31 crc kubenswrapper[4759]: I1205 02:40:31.741572 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/account-auditor/0.log" Dec 05 02:40:31 crc kubenswrapper[4759]: I1205 02:40:31.804530 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/account-reaper/0.log" Dec 05 02:40:31 crc kubenswrapper[4759]: I1205 02:40:31.988139 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/account-server/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.012156 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/account-replicator/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.043030 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/container-auditor/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.109682 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/container-replicator/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.186970 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/container-server/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.210215 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/container-updater/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.288247 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/object-auditor/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.339662 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/object-expirer/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.455209 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/object-server/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.477242 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/object-replicator/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.493029 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/object-updater/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.586093 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/rsync/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.708333 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_28edaf49-80c4-4732-a19f-1f2348fcd8e7/swift-recon-cron/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.787036 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nrwj6_74529ffd-281e-4f93-b8a1-fc858a1369c4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:32 crc kubenswrapper[4759]: I1205 02:40:32.966000 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-w6g92_ef0b2002-5521-4629-8083-fd25b382c0db/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:33 crc kubenswrapper[4759]: I1205 02:40:33.184838 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_98168e43-7dcd-4145-b010-078c0d190596/test-operator-logs-container/0.log" Dec 05 02:40:33 crc kubenswrapper[4759]: I1205 02:40:33.400854 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-h6p2k_643a4a0e-1e9d-43a0-927c-ddb0778691f3/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 02:40:33 crc kubenswrapper[4759]: I1205 02:40:33.970464 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_703704f3-2e29-4eed-8943-3a34a004d8fc/tempest-tests-tempest-tests-runner/0.log" Dec 05 02:40:34 crc kubenswrapper[4759]: I1205 02:40:34.435436 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:40:34 crc kubenswrapper[4759]: I1205 02:40:34.435708 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:40:34 crc kubenswrapper[4759]: I1205 02:40:34.435755 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 02:40:34 crc kubenswrapper[4759]: I1205 02:40:34.436564 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb796f1932774ddca456258736516c5764a73173d0beca98b07c9eb74be8a3e4"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 02:40:34 crc kubenswrapper[4759]: I1205 02:40:34.436616 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://cb796f1932774ddca456258736516c5764a73173d0beca98b07c9eb74be8a3e4" gracePeriod=600 Dec 05 02:40:35 crc kubenswrapper[4759]: I1205 02:40:35.300048 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="cb796f1932774ddca456258736516c5764a73173d0beca98b07c9eb74be8a3e4" exitCode=0 Dec 05 02:40:35 crc kubenswrapper[4759]: I1205 02:40:35.301260 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"cb796f1932774ddca456258736516c5764a73173d0beca98b07c9eb74be8a3e4"} Dec 05 02:40:35 crc kubenswrapper[4759]: I1205 02:40:35.301436 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e"} Dec 05 02:40:35 crc kubenswrapper[4759]: I1205 02:40:35.301530 4759 scope.go:117] "RemoveContainer" containerID="c3aac024b5be78c2d7a82b609bf5c46da74e7bb4d003d2d7ef3aaa1da67ea362" Dec 05 02:40:45 crc kubenswrapper[4759]: I1205 02:40:45.916220 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cf79c940-d58e-4319-94e8-6bacc34b1ae5/memcached/0.log" Dec 05 02:41:05 crc kubenswrapper[4759]: I1205 02:41:05.175270 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/util/0.log" Dec 05 02:41:05 crc kubenswrapper[4759]: I1205 02:41:05.398700 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/util/0.log" Dec 05 02:41:05 crc kubenswrapper[4759]: I1205 02:41:05.444716 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/pull/0.log" Dec 05 02:41:05 crc kubenswrapper[4759]: I1205 02:41:05.469421 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/pull/0.log" Dec 05 
Dec 05 02:41:05 crc kubenswrapper[4759]: I1205 02:41:05.729898 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/util/0.log" Dec 05 02:41:05 crc kubenswrapper[4759]: I1205 02:41:05.738776 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1a4842edb765f4af3437abd4d66fad3b33eb215dd656044bf1d3aa4714rx9n4_8fbf91d7-0670-400f-92fe-30da90f6e105/extract/0.log" Dec 05 02:41:05 crc kubenswrapper[4759]: I1205 02:41:05.878666 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ztxcj_5e28b15c-c39e-463a-b9a2-6f6df5addaf8/kube-rbac-proxy/0.log" Dec 05 02:41:05 crc kubenswrapper[4759]: I1205 02:41:05.966419 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-ztxcj_5e28b15c-c39e-463a-b9a2-6f6df5addaf8/manager/0.log" Dec 05 02:41:05 crc kubenswrapper[4759]: I1205 02:41:05.978568 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-zczgr_310627fe-09af-4a51-8312-e2b3841d6634/kube-rbac-proxy/0.log" Dec 05 02:41:06 crc kubenswrapper[4759]: I1205 02:41:06.156285 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-zczgr_310627fe-09af-4a51-8312-e2b3841d6634/manager/0.log" Dec 05 02:41:06 crc kubenswrapper[4759]: I1205 02:41:06.196393 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-6gh82_faf33139-8ab8-400c-8a2a-bf746d11f7e7/kube-rbac-proxy/0.log" Dec 05 02:41:06 crc kubenswrapper[4759]: I1205 02:41:06.294510 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-6gh82_faf33139-8ab8-400c-8a2a-bf746d11f7e7/manager/0.log" Dec 05 02:41:06 crc kubenswrapper[4759]: I1205 02:41:06.424080 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-s6jzr_7377061e-a243-49b5-9728-4aaa2462445e/kube-rbac-proxy/0.log" Dec 05 02:41:06 crc kubenswrapper[4759]: I1205 02:41:06.500674 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-s6jzr_7377061e-a243-49b5-9728-4aaa2462445e/manager/0.log" Dec 05 02:41:06 crc kubenswrapper[4759]: I1205 02:41:06.599323 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-lrxdn_f0212865-8c85-4b7c-855c-baa0fc705bf8/kube-rbac-proxy/0.log" Dec 05 02:41:06 crc kubenswrapper[4759]: I1205 02:41:06.994943 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jrlg9_a712ae8c-434d-43f7-ab4d-b385eee4eabf/kube-rbac-proxy/0.log" Dec 05 02:41:07 crc kubenswrapper[4759]: I1205 02:41:07.162053 4759 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jrlg9_a712ae8c-434d-43f7-ab4d-b385eee4eabf/manager/0.log" Dec 05 02:41:07 crc kubenswrapper[4759]: I1205 02:41:07.195402 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-9jfsw_1cce914e-3baa-4146-a52c-e054ee0c1eed/kube-rbac-proxy/0.log" Dec 05 02:41:07 crc kubenswrapper[4759]: I1205 02:41:07.211129 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-lrxdn_f0212865-8c85-4b7c-855c-baa0fc705bf8/manager/0.log" Dec 05 02:41:07 crc kubenswrapper[4759]: I1205 02:41:07.432261 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-7s2r8_f07c10f0-5bec-4421-8ff0-2c659e42377b/kube-rbac-proxy/0.log" Dec 05 02:41:07 crc kubenswrapper[4759]: I1205 02:41:07.441386 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-7s2r8_f07c10f0-5bec-4421-8ff0-2c659e42377b/manager/0.log" Dec 05 02:41:07 crc kubenswrapper[4759]: I1205 02:41:07.530884 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-9jfsw_1cce914e-3baa-4146-a52c-e054ee0c1eed/manager/0.log" Dec 05 02:41:07 crc kubenswrapper[4759]: I1205 02:41:07.672651 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-5h7t4_1626dead-b9fd-4fae-af93-e2332112626f/kube-rbac-proxy/0.log" Dec 05 02:41:07 crc kubenswrapper[4759]: I1205 02:41:07.773263 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-5h7t4_1626dead-b9fd-4fae-af93-e2332112626f/manager/0.log" Dec 05 02:41:07 crc kubenswrapper[4759]: I1205 02:41:07.919381 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zh2wq_785b512b-7fa8-4480-b042-3811f10e3659/kube-rbac-proxy/0.log" Dec 05 02:41:07 crc kubenswrapper[4759]: I1205 02:41:07.983819 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-zh2wq_785b512b-7fa8-4480-b042-3811f10e3659/manager/0.log" Dec 05 02:41:08 crc kubenswrapper[4759]: I1205 02:41:08.005939 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ckxbs_cbc0cab7-b730-4ada-994d-eb8ae2e014df/kube-rbac-proxy/0.log" Dec 05 02:41:08 crc kubenswrapper[4759]: I1205 02:41:08.203944 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ckxbs_cbc0cab7-b730-4ada-994d-eb8ae2e014df/manager/0.log" Dec 05 02:41:08 crc kubenswrapper[4759]: I1205 02:41:08.252052 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-p5g9w_2dcdddec-138e-46fd-ab1d-15e4c4a06a15/kube-rbac-proxy/0.log" Dec 05 02:41:08 crc kubenswrapper[4759]: I1205 02:41:08.279776 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-p5g9w_2dcdddec-138e-46fd-ab1d-15e4c4a06a15/manager/0.log" Dec 05 02:41:08 crc kubenswrapper[4759]: I1205 02:41:08.510762 4759 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-dcs2p_976da4b0-9b83-4ffe-9cf2-a07c3e149e04/manager/0.log" Dec 05 02:41:08 crc kubenswrapper[4759]: I1205 02:41:08.537771 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-dcs2p_976da4b0-9b83-4ffe-9cf2-a07c3e149e04/kube-rbac-proxy/0.log" Dec 05 02:41:08 crc kubenswrapper[4759]: I1205 02:41:08.728364 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-dwt5n_9894d9a4-5121-4345-ab9c-4f770f4e4bb0/kube-rbac-proxy/0.log" Dec 05 02:41:08 crc kubenswrapper[4759]: I1205 02:41:08.773445 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-dwt5n_9894d9a4-5121-4345-ab9c-4f770f4e4bb0/manager/0.log" Dec 05 02:41:08 crc kubenswrapper[4759]: I1205 02:41:08.798494 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g_33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9/kube-rbac-proxy/0.log" Dec 05 02:41:09 crc kubenswrapper[4759]: I1205 02:41:09.007485 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4r5s8g_33f594b2-42dc-4c4e-95e4-ebdfe6c96ee9/manager/0.log" Dec 05 02:41:09 crc kubenswrapper[4759]: I1205 02:41:09.307420 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-lcpcr_627231bc-7c87-4c95-9a7e-ca5c295bfc69/registry-server/0.log" Dec 05 02:41:09 crc kubenswrapper[4759]: I1205 02:41:09.463548 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7c496d6cb7-z59q2_c02805c1-2950-4e50-9163-a3ca8d5c4319/operator/0.log" Dec 05 02:41:09 crc kubenswrapper[4759]: I1205 02:41:09.516031 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-j587r_0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da/kube-rbac-proxy/0.log" Dec 05 02:41:09 crc kubenswrapper[4759]: I1205 02:41:09.740673 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-f9b75_f35362c5-4886-42dc-a633-c018e7f6aaf2/kube-rbac-proxy/0.log" Dec 05 02:41:09 crc kubenswrapper[4759]: I1205 02:41:09.746482 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-j587r_0bfe4b5d-2f8e-4378-aee7-be0c44e2e3da/manager/0.log" Dec 05 02:41:09 crc kubenswrapper[4759]: I1205 02:41:09.785998 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-f9b75_f35362c5-4886-42dc-a633-c018e7f6aaf2/manager/0.log" Dec 05 02:41:09 crc kubenswrapper[4759]: I1205 02:41:09.983668 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-85hsm_d832c1ee-6d66-4cd7-87eb-dc2d34f801cc/operator/0.log" Dec 05 02:41:10 crc kubenswrapper[4759]: I1205 02:41:10.065229 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-qxf6v_da012733-6903-4607-9be5-17c81d20ae6b/kube-rbac-proxy/0.log" Dec 05 02:41:10 crc kubenswrapper[4759]: I1205 02:41:10.177686 4759 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-qxf6v_da012733-6903-4607-9be5-17c81d20ae6b/manager/0.log" Dec 05 02:41:10 crc kubenswrapper[4759]: I1205 02:41:10.297963 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6578c5f884-gml69_b6222e18-ffef-4dc6-b327-3b06bb91d75a/kube-rbac-proxy/0.log" Dec 05 02:41:10 crc kubenswrapper[4759]: I1205 02:41:10.538921 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-8m9f7_617fefa5-c3f6-450e-a569-8ee3dd12f882/manager/0.log" Dec 05 02:41:10 crc kubenswrapper[4759]: I1205 02:41:10.543150 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-8m9f7_617fefa5-c3f6-450e-a569-8ee3dd12f882/kube-rbac-proxy/0.log" Dec 05 02:41:10 crc kubenswrapper[4759]: I1205 02:41:10.738926 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6578c5f884-gml69_b6222e18-ffef-4dc6-b327-3b06bb91d75a/manager/0.log" Dec 05 02:41:10 crc kubenswrapper[4759]: I1205 02:41:10.796336 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-w42fg_e081130e-f14c-489c-9e4e-faab3dbdee6c/kube-rbac-proxy/0.log" Dec 05 02:41:10 crc kubenswrapper[4759]: I1205 02:41:10.822644 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-w42fg_e081130e-f14c-489c-9e4e-faab3dbdee6c/manager/0.log" Dec 05 02:41:10 crc kubenswrapper[4759]: I1205 02:41:10.910603 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-759bbb976c-dtqzv_4aaabb3d-0c0e-4af8-ae89-5bcb206be1ba/manager/0.log" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.161388 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m98bv"] Dec 05 02:41:18 crc kubenswrapper[4759]: E1205 02:41:18.162392 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b895916-0cc6-4ef0-bc0b-8806cfec023a" containerName="container-00" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.162404 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b895916-0cc6-4ef0-bc0b-8806cfec023a" containerName="container-00" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.162655 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b895916-0cc6-4ef0-bc0b-8806cfec023a" containerName="container-00" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.165237 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.174500 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m98bv"] Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.271212 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6tvz\" (UniqueName: \"kubernetes.io/projected/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-kube-api-access-r6tvz\") pod \"redhat-operators-m98bv\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.271572 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-catalog-content\") pod \"redhat-operators-m98bv\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.271629 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-utilities\") pod \"redhat-operators-m98bv\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.408354 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6tvz\" (UniqueName: \"kubernetes.io/projected/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-kube-api-access-r6tvz\") pod \"redhat-operators-m98bv\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.408402 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-catalog-content\") pod \"redhat-operators-m98bv\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.408441 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-utilities\") pod \"redhat-operators-m98bv\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.409632 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-catalog-content\") pod \"redhat-operators-m98bv\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.409828 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-utilities\") pod \"redhat-operators-m98bv\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.428489 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-r6tvz\" (UniqueName: \"kubernetes.io/projected/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-kube-api-access-r6tvz\") pod \"redhat-operators-m98bv\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:18 crc kubenswrapper[4759]: I1205 02:41:18.503325 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:19 crc kubenswrapper[4759]: I1205 02:41:19.095723 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m98bv"] Dec 05 02:41:19 crc kubenswrapper[4759]: I1205 02:41:19.763528 4759 generic.go:334] "Generic (PLEG): container finished" podID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerID="2ce0f02d13465f5085d4ad09753412b43b11365023afde38f6d15598edc161b5" exitCode=0 Dec 05 02:41:19 crc kubenswrapper[4759]: I1205 02:41:19.763720 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m98bv" event={"ID":"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9","Type":"ContainerDied","Data":"2ce0f02d13465f5085d4ad09753412b43b11365023afde38f6d15598edc161b5"} Dec 05 02:41:19 crc kubenswrapper[4759]: I1205 02:41:19.763816 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m98bv" event={"ID":"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9","Type":"ContainerStarted","Data":"2d4b20e24d99d94897ae4edbc7b457e551346338e3465125fe6ceca9bc3a54c8"} Dec 05 02:41:20 crc kubenswrapper[4759]: I1205 02:41:20.777556 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m98bv" event={"ID":"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9","Type":"ContainerStarted","Data":"073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701"} Dec 05 02:41:24 crc kubenswrapper[4759]: I1205 02:41:24.842349 4759 generic.go:334] "Generic (PLEG): container finished" podID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerID="073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701" exitCode=0 Dec 05 02:41:24 crc kubenswrapper[4759]: I1205 02:41:24.842412 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m98bv" event={"ID":"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9","Type":"ContainerDied","Data":"073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701"} Dec 05 02:41:25 crc kubenswrapper[4759]: I1205 02:41:25.855975 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m98bv" event={"ID":"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9","Type":"ContainerStarted","Data":"3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9"} Dec 05 02:41:25 crc kubenswrapper[4759]: I1205 02:41:25.887138 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m98bv" podStartSLOduration=2.398666806 podStartE2EDuration="7.887102294s" podCreationTimestamp="2025-12-05 02:41:18 +0000 UTC" firstStartedPulling="2025-12-05 02:41:19.765530723 +0000 UTC m=+8298.981191663" lastFinishedPulling="2025-12-05 02:41:25.253966201 +0000 UTC m=+8304.469627151" observedRunningTime="2025-12-05 02:41:25.883883407 +0000 UTC m=+8305.099544357" watchObservedRunningTime="2025-12-05 02:41:25.887102294 +0000 UTC m=+8305.102763244" Dec 05 02:41:28 crc kubenswrapper[4759]: I1205 02:41:28.504099 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m98bv" 
Dec 05 02:41:28 crc kubenswrapper[4759]: I1205 02:41:28.504722 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:29 crc kubenswrapper[4759]: I1205 02:41:29.554727 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m98bv" podUID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerName="registry-server" probeResult="failure" output=< Dec 05 02:41:29 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 02:41:29 crc kubenswrapper[4759]: > Dec 05 02:41:33 crc kubenswrapper[4759]: I1205 02:41:33.780438 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2zx9b_1a9ce926-4d8b-4608-9c75-9ddbc87a2464/control-plane-machine-set-operator/0.log" Dec 05 02:41:33 crc kubenswrapper[4759]: I1205 02:41:33.975969 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gng7x_f2b40743-a414-4dd8-9613-0bc14b937e3d/kube-rbac-proxy/0.log" Dec 05 02:41:33 crc kubenswrapper[4759]: I1205 02:41:33.985819 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gng7x_f2b40743-a414-4dd8-9613-0bc14b937e3d/machine-api-operator/0.log" Dec 05 02:41:39 crc kubenswrapper[4759]: I1205 02:41:39.555016 4759 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m98bv" podUID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerName="registry-server" probeResult="failure" output=< Dec 05 02:41:39 crc kubenswrapper[4759]: timeout: failed to connect service ":50051" within 1s Dec 05 02:41:39 crc kubenswrapper[4759]: > Dec 05 02:41:48 crc kubenswrapper[4759]: I1205 02:41:48.558842 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:48 crc kubenswrapper[4759]: I1205 02:41:48.611000 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:49 crc kubenswrapper[4759]: I1205 02:41:49.367027 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m98bv"] Dec 05 02:41:49 crc kubenswrapper[4759]: I1205 02:41:49.446733 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4d8pk_a45999cd-2b52-4802-8bf1-98905eb68923/cert-manager-controller/0.log" Dec 05 02:41:49 crc kubenswrapper[4759]: I1205 02:41:49.646673 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-p4smd_a52b68b3-6d54-414e-8ab9-37788f4ec793/cert-manager-cainjector/0.log" Dec 05 02:41:49 crc kubenswrapper[4759]: I1205 02:41:49.694609 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-h8v7g_2e4a4c71-73fb-4b2d-aa99-aba26e4b4d65/cert-manager-webhook/0.log" Dec 05 02:41:50 crc kubenswrapper[4759]: I1205 02:41:50.156409 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m98bv" podUID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerName="registry-server" containerID="cri-o://3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9" gracePeriod=2 Dec 05 02:41:50 crc kubenswrapper[4759]: I1205 02:41:50.750151 4759 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:50 crc kubenswrapper[4759]: I1205 02:41:50.923799 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-catalog-content\") pod \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " Dec 05 02:41:50 crc kubenswrapper[4759]: I1205 02:41:50.924075 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6tvz\" (UniqueName: \"kubernetes.io/projected/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-kube-api-access-r6tvz\") pod \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " Dec 05 02:41:50 crc kubenswrapper[4759]: I1205 02:41:50.924131 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-utilities\") pod \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\" (UID: \"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9\") " Dec 05 02:41:50 crc kubenswrapper[4759]: I1205 02:41:50.932005 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-utilities" (OuterVolumeSpecName: "utilities") pod "f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" (UID: "f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:41:50 crc kubenswrapper[4759]: I1205 02:41:50.944530 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-kube-api-access-r6tvz" (OuterVolumeSpecName: "kube-api-access-r6tvz") pod "f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" (UID: "f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9"). InnerVolumeSpecName "kube-api-access-r6tvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.027596 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.027634 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6tvz\" (UniqueName: \"kubernetes.io/projected/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-kube-api-access-r6tvz\") on node \"crc\" DevicePath \"\"" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.057842 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" (UID: "f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.128746 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.167569 4759 generic.go:334] "Generic (PLEG): container finished" podID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerID="3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9" exitCode=0 Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.167683 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m98bv" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.171921 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m98bv" event={"ID":"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9","Type":"ContainerDied","Data":"3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9"} Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.171963 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m98bv" event={"ID":"f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9","Type":"ContainerDied","Data":"2d4b20e24d99d94897ae4edbc7b457e551346338e3465125fe6ceca9bc3a54c8"} Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.171985 4759 scope.go:117] "RemoveContainer" containerID="3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.193786 4759 scope.go:117] "RemoveContainer" containerID="073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.223975 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m98bv"] Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.228188 4759 scope.go:117] "RemoveContainer" containerID="2ce0f02d13465f5085d4ad09753412b43b11365023afde38f6d15598edc161b5" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.234554 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m98bv"] Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.284884 4759 scope.go:117] "RemoveContainer" containerID="3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9" Dec 05 02:41:51 crc kubenswrapper[4759]: E1205 02:41:51.326386 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9\": container with ID starting with 3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9 not found: ID does not exist" containerID="3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.326445 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9"} err="failed to get container status \"3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9\": rpc error: code = NotFound desc = could not find container \"3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9\": container with ID starting with 3f8851eb87f4fb0eafe246aab734f43826c3a0385ce1cd5587fb7bbc79e8c0b9 not found: ID does not exist" Dec 05 02:41:51 crc 
kubenswrapper[4759]: I1205 02:41:51.326473 4759 scope.go:117] "RemoveContainer" containerID="073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701" Dec 05 02:41:51 crc kubenswrapper[4759]: E1205 02:41:51.326925 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701\": container with ID starting with 073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701 not found: ID does not exist" containerID="073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.326967 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701"} err="failed to get container status \"073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701\": rpc error: code = NotFound desc = could not find container \"073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701\": container with ID starting with 073e1e79c3e06c7c6a3de8010f296cecaf19427daac6f410b2afea4b92b72701 not found: ID does not exist" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.326987 4759 scope.go:117] "RemoveContainer" containerID="2ce0f02d13465f5085d4ad09753412b43b11365023afde38f6d15598edc161b5" Dec 05 02:41:51 crc kubenswrapper[4759]: E1205 02:41:51.327247 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce0f02d13465f5085d4ad09753412b43b11365023afde38f6d15598edc161b5\": container with ID starting with 2ce0f02d13465f5085d4ad09753412b43b11365023afde38f6d15598edc161b5 not found: ID does not exist" containerID="2ce0f02d13465f5085d4ad09753412b43b11365023afde38f6d15598edc161b5" Dec 05 02:41:51 crc kubenswrapper[4759]: I1205 02:41:51.327332 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce0f02d13465f5085d4ad09753412b43b11365023afde38f6d15598edc161b5"} err="failed to get container status \"2ce0f02d13465f5085d4ad09753412b43b11365023afde38f6d15598edc161b5\": rpc error: code = NotFound desc = could not find container \"2ce0f02d13465f5085d4ad09753412b43b11365023afde38f6d15598edc161b5\": container with ID starting with 2ce0f02d13465f5085d4ad09753412b43b11365023afde38f6d15598edc161b5 not found: ID does not exist" Dec 05 02:41:53 crc kubenswrapper[4759]: I1205 02:41:53.169399 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" path="/var/lib/kubelet/pods/f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9/volumes" Dec 05 02:42:05 crc kubenswrapper[4759]: I1205 02:42:05.239547 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-srcnc_93f7aeec-1ff3-4cec-80b9-683bfda8584b/nmstate-console-plugin/0.log" Dec 05 02:42:05 crc kubenswrapper[4759]: I1205 02:42:05.431779 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-nsbmm_1c4aa01f-df16-4f20-914f-1238c9c497ab/nmstate-handler/0.log" Dec 05 02:42:05 crc kubenswrapper[4759]: I1205 02:42:05.540347 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-hhc6d_463bc80d-5fb1-4bf0-b596-4f41571b3178/nmstate-metrics/0.log" Dec 05 02:42:05 crc kubenswrapper[4759]: I1205 02:42:05.541122 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-hhc6d_463bc80d-5fb1-4bf0-b596-4f41571b3178/kube-rbac-proxy/0.log" Dec 05 02:42:05 crc kubenswrapper[4759]: I1205 02:42:05.721408 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-9ztsr_05de94e3-b61f-4df3-a8f4-a0b97d65b575/nmstate-webhook/0.log" Dec 05 02:42:05 crc kubenswrapper[4759]: I1205 02:42:05.733039 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-w55p6_c0583a6d-7e56-455f-8557-f78732ffd0dc/nmstate-operator/0.log" Dec 05 02:42:20 crc kubenswrapper[4759]: I1205 02:42:20.542236 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55cfc66bc8-kk6tz_7c292474-9687-4ff3-a1c3-4dffe9594a36/kube-rbac-proxy/0.log" Dec 05 02:42:20 crc kubenswrapper[4759]: I1205 02:42:20.630705 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55cfc66bc8-kk6tz_7c292474-9687-4ff3-a1c3-4dffe9594a36/manager/0.log" Dec 05 02:42:34 crc kubenswrapper[4759]: I1205 02:42:34.433290 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:42:34 crc kubenswrapper[4759]: I1205 02:42:34.433951 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:42:36 crc kubenswrapper[4759]: I1205 02:42:36.745772 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-j6x8h_bd82eb36-66d9-4938-a5ab-29c36b1f482e/cluster-logging-operator/0.log" Dec 05 02:42:36 crc kubenswrapper[4759]: I1205 02:42:36.921653 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-dx9kr_35a9bf94-4e4a-4d68-95d8-1ae9421bb76f/collector/0.log" Dec 05 02:42:36 crc kubenswrapper[4759]: I1205 02:42:36.960740 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_d4684c5d-5bd7-4500-8f88-1778f47325c3/loki-compactor/0.log" Dec 05 02:42:37 crc kubenswrapper[4759]: I1205 02:42:37.107572 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-sfxqb_ec954d4c-6908-403f-8241-87a5191ddd17/loki-distributor/0.log" Dec 05 02:42:37 crc kubenswrapper[4759]: I1205 02:42:37.132134 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-99665cbbf-cf7qk_e2fb0fbf-7c9c-4671-af48-6217b781c53d/gateway/0.log" Dec 05 02:42:37 crc kubenswrapper[4759]: I1205 02:42:37.148355 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-99665cbbf-cf7qk_e2fb0fbf-7c9c-4671-af48-6217b781c53d/opa/0.log" Dec 05 02:42:37 crc kubenswrapper[4759]: I1205 02:42:37.463216 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-99665cbbf-gczvw_26e7c80b-666f-472c-8fb4-d3349c69227e/gateway/0.log" Dec 05 02:42:37 crc 
kubenswrapper[4759]: I1205 02:42:37.493358 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-99665cbbf-gczvw_26e7c80b-666f-472c-8fb4-d3349c69227e/opa/0.log" Dec 05 02:42:37 crc kubenswrapper[4759]: I1205 02:42:37.722580 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_c6186de1-0fbc-4432-8bb4-c95e25efe3a7/loki-index-gateway/0.log" Dec 05 02:42:37 crc kubenswrapper[4759]: I1205 02:42:37.802267 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_e0a9677f-60fe-4bcf-8262-250684b96537/loki-ingester/0.log" Dec 05 02:42:37 crc kubenswrapper[4759]: I1205 02:42:37.950029 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-plh5s_a02b1847-e805-40e3-bbfb-0585e864e6d0/loki-querier/0.log" Dec 05 02:42:38 crc kubenswrapper[4759]: I1205 02:42:38.000986 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-62g2t_f45b1eaf-54f2-400d-996e-95fbaff73750/loki-query-frontend/0.log" Dec 05 02:42:52 crc kubenswrapper[4759]: I1205 02:42:52.469643 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-dk9mn_78f52cea-319f-4493-aa77-b97f1fed1583/kube-rbac-proxy/0.log" Dec 05 02:42:52 crc kubenswrapper[4759]: I1205 02:42:52.603495 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-dk9mn_78f52cea-319f-4493-aa77-b97f1fed1583/controller/0.log" Dec 05 02:42:52 crc kubenswrapper[4759]: I1205 02:42:52.638542 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-frr-files/0.log" Dec 05 02:42:52 crc kubenswrapper[4759]: I1205 02:42:52.864994 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-frr-files/0.log" Dec 05 02:42:52 crc kubenswrapper[4759]: I1205 02:42:52.869283 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-metrics/0.log" Dec 05 02:42:52 crc kubenswrapper[4759]: I1205 02:42:52.869333 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-reloader/0.log" Dec 05 02:42:52 crc kubenswrapper[4759]: I1205 02:42:52.871109 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-reloader/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.022703 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-frr-files/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.054923 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-metrics/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.078894 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-reloader/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.081624 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-metrics/0.log" Dec 05 02:42:53 crc 
kubenswrapper[4759]: I1205 02:42:53.275045 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-reloader/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.277116 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-frr-files/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.309749 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/controller/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.312280 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/cp-metrics/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.434726 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/frr-metrics/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.491421 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/kube-rbac-proxy/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.525793 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/kube-rbac-proxy-frr/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.660189 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/reloader/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.766558 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-7kpbw_b81e7b66-fed2-4b1e-8504-22a839862f14/frr-k8s-webhook-server/0.log" Dec 05 02:42:53 crc kubenswrapper[4759]: I1205 02:42:53.948795 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5544dd96f7-h9gmp_f5b08a58-e4f1-4520-aec9-e0f99e93e731/manager/0.log" Dec 05 02:42:54 crc kubenswrapper[4759]: I1205 02:42:54.140537 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6959c5664d-69r4h_161c1f00-7ae1-4d8e-8d03-48b55dd5a8cc/webhook-server/0.log" Dec 05 02:42:54 crc kubenswrapper[4759]: I1205 02:42:54.226827 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wv8kr_b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51/kube-rbac-proxy/0.log" Dec 05 02:42:54 crc kubenswrapper[4759]: I1205 02:42:54.974558 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wv8kr_b2a430bf-ca0c-4f34-a1c5-ca5f5919aa51/speaker/0.log" Dec 05 02:42:55 crc kubenswrapper[4759]: I1205 02:42:55.655807 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2hbsc_1343e9fd-38e9-4285-89a6-f4a15dfca396/frr/0.log" Dec 05 02:43:04 crc kubenswrapper[4759]: I1205 02:43:04.433767 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:43:04 crc kubenswrapper[4759]: I1205 02:43:04.434299 4759 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:43:08 crc kubenswrapper[4759]: I1205 02:43:08.678719 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/util/0.log" Dec 05 02:43:08 crc kubenswrapper[4759]: I1205 02:43:08.857834 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/util/0.log" Dec 05 02:43:08 crc kubenswrapper[4759]: I1205 02:43:08.916718 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/pull/0.log" Dec 05 02:43:08 crc kubenswrapper[4759]: I1205 02:43:08.922147 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/pull/0.log" Dec 05 02:43:09 crc kubenswrapper[4759]: I1205 02:43:09.074896 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/util/0.log" Dec 05 02:43:09 crc kubenswrapper[4759]: I1205 02:43:09.089018 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/pull/0.log" Dec 05 02:43:09 crc kubenswrapper[4759]: I1205 02:43:09.091777 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8vsc9v_5126d26d-8fe3-473d-bd52-52709d0fbb37/extract/0.log" Dec 05 02:43:09 crc kubenswrapper[4759]: I1205 02:43:09.272213 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/util/0.log" Dec 05 02:43:09 crc kubenswrapper[4759]: I1205 02:43:09.485244 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/util/0.log" Dec 05 02:43:09 crc kubenswrapper[4759]: I1205 02:43:09.503861 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/pull/0.log" Dec 05 02:43:09 crc kubenswrapper[4759]: I1205 02:43:09.504714 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/pull/0.log" Dec 05 02:43:09 crc kubenswrapper[4759]: I1205 02:43:09.670914 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/pull/0.log" Dec 05 02:43:09 crc kubenswrapper[4759]: I1205 02:43:09.724579 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/extract/0.log" Dec 05 02:43:09 crc kubenswrapper[4759]: I1205 02:43:09.730341 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4dlcz_1872bbfd-0448-4ee8-af95-9b4db67c58c9/util/0.log" Dec 05 02:43:09 crc kubenswrapper[4759]: I1205 02:43:09.864059 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/util/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.083154 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/pull/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.088868 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/util/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.089704 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/pull/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.244178 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/util/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.255533 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/pull/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.265911 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210hw2xc_ab2043d5-dc62-4d55-908e-fdae23325471/extract/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.451069 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/util/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.603516 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/util/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.643253 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/pull/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.647471 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/pull/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.777224 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/pull/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.804844 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/util/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.827453 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fzssj2_122b71c8-7720-4f98-b9b6-e68ca39986bf/extract/0.log" Dec 05 02:43:10 crc kubenswrapper[4759]: I1205 02:43:10.969025 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/util/0.log" Dec 05 02:43:11 crc kubenswrapper[4759]: I1205 02:43:11.208228 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/pull/0.log" Dec 05 02:43:11 crc kubenswrapper[4759]: I1205 02:43:11.216571 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/util/0.log" Dec 05 02:43:11 crc kubenswrapper[4759]: I1205 02:43:11.271698 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/pull/0.log" Dec 05 02:43:11 crc kubenswrapper[4759]: I1205 02:43:11.498103 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/pull/0.log" Dec 05 02:43:11 crc kubenswrapper[4759]: I1205 02:43:11.498592 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/extract/0.log" Dec 05 02:43:11 crc kubenswrapper[4759]: I1205 02:43:11.506797 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f834l5hd_ac9a191d-1c37-4695-82d2-d502ed5245ff/util/0.log" Dec 05 02:43:11 crc kubenswrapper[4759]: I1205 02:43:11.864250 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-utilities/0.log" Dec 05 02:43:12 crc kubenswrapper[4759]: I1205 02:43:12.041420 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-content/0.log" Dec 05 02:43:12 crc kubenswrapper[4759]: I1205 02:43:12.059203 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-utilities/0.log" Dec 05 02:43:12 crc kubenswrapper[4759]: I1205 02:43:12.071080 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-content/0.log" Dec 05 02:43:12 crc kubenswrapper[4759]: I1205 02:43:12.248148 4759 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-utilities/0.log" Dec 05 02:43:12 crc kubenswrapper[4759]: I1205 02:43:12.264122 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/extract-content/0.log" Dec 05 02:43:12 crc kubenswrapper[4759]: I1205 02:43:12.483338 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-utilities/0.log" Dec 05 02:43:12 crc kubenswrapper[4759]: I1205 02:43:12.704548 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-content/0.log" Dec 05 02:43:12 crc kubenswrapper[4759]: I1205 02:43:12.715669 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-content/0.log" Dec 05 02:43:12 crc kubenswrapper[4759]: I1205 02:43:12.742936 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-utilities/0.log" Dec 05 02:43:13 crc kubenswrapper[4759]: I1205 02:43:13.087665 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-utilities/0.log" Dec 05 02:43:13 crc kubenswrapper[4759]: I1205 02:43:13.144703 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/extract-content/0.log" Dec 05 02:43:13 crc kubenswrapper[4759]: I1205 02:43:13.319683 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhwcd_5a767665-6baa-48cf-98b9-825fa8ff6b63/registry-server/0.log" Dec 05 02:43:13 crc kubenswrapper[4759]: I1205 02:43:13.368711 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-n7tt2_423b5d0f-1418-420b-80ca-f05d0087c85e/marketplace-operator/0.log" Dec 05 02:43:13 crc kubenswrapper[4759]: I1205 02:43:13.409344 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-utilities/0.log" Dec 05 02:43:13 crc kubenswrapper[4759]: I1205 02:43:13.652949 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-content/0.log" Dec 05 02:43:13 crc kubenswrapper[4759]: I1205 02:43:13.652954 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-utilities/0.log" Dec 05 02:43:13 crc kubenswrapper[4759]: I1205 02:43:13.695834 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-content/0.log" Dec 05 02:43:13 crc kubenswrapper[4759]: I1205 02:43:13.902783 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-content/0.log" Dec 05 02:43:13 crc kubenswrapper[4759]: I1205 02:43:13.953601 4759 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/extract-utilities/0.log" Dec 05 02:43:14 crc kubenswrapper[4759]: I1205 02:43:14.141036 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-utilities/0.log" Dec 05 02:43:14 crc kubenswrapper[4759]: I1205 02:43:14.343948 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vqxx5_cc48a8ea-811f-4524-aebb-6518efb9c7f5/registry-server/0.log" Dec 05 02:43:14 crc kubenswrapper[4759]: I1205 02:43:14.381038 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9l72t_a197c31d-778b-4261-bfd1-0469436747e5/registry-server/0.log" Dec 05 02:43:14 crc kubenswrapper[4759]: I1205 02:43:14.383110 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-content/0.log" Dec 05 02:43:14 crc kubenswrapper[4759]: I1205 02:43:14.410837 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-content/0.log" Dec 05 02:43:14 crc kubenswrapper[4759]: I1205 02:43:14.448974 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-utilities/0.log" Dec 05 02:43:14 crc kubenswrapper[4759]: I1205 02:43:14.596336 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-utilities/0.log" Dec 05 02:43:14 crc kubenswrapper[4759]: I1205 02:43:14.604212 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/extract-content/0.log" Dec 05 02:43:15 crc kubenswrapper[4759]: I1205 02:43:15.545137 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7bfqq_af5bad0e-3c28-41e5-bd38-a9251291150c/registry-server/0.log" Dec 05 02:43:28 crc kubenswrapper[4759]: I1205 02:43:28.423208 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-sc8sd_b21d3e23-0940-4825-801e-ae74255085bd/prometheus-operator/0.log" Dec 05 02:43:28 crc kubenswrapper[4759]: I1205 02:43:28.585836 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6fd5db955-9ck5c_38b9b1d9-e67c-4aad-a22a-496d348f5148/prometheus-operator-admission-webhook/0.log" Dec 05 02:43:28 crc kubenswrapper[4759]: I1205 02:43:28.621427 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6fd5db955-w2pkb_94fbcc74-1faa-44a4-8ea9-36028cc96003/prometheus-operator-admission-webhook/0.log" Dec 05 02:43:28 crc kubenswrapper[4759]: I1205 02:43:28.730676 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-6pmzz_d87236a6-f3c6-470f-a197-05846a9b0c22/operator/0.log" Dec 05 02:43:28 crc kubenswrapper[4759]: I1205 02:43:28.798489 4759 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-q8g69_3555f68d-fb68-4cf9-91e0-51cc25d2305c/observability-ui-dashboards/0.log" Dec 05 02:43:28 crc kubenswrapper[4759]: I1205 02:43:28.961170 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-qp8vs_96032405-2b01-4177-895c-f26ca2d838a9/perses-operator/0.log" Dec 05 02:43:34 crc kubenswrapper[4759]: I1205 02:43:34.434126 4759 patch_prober.go:28] interesting pod/machine-config-daemon-5q8ns container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 02:43:34 crc kubenswrapper[4759]: I1205 02:43:34.435015 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 02:43:34 crc kubenswrapper[4759]: I1205 02:43:34.435071 4759 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" Dec 05 02:43:34 crc kubenswrapper[4759]: I1205 02:43:34.436011 4759 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e"} pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 02:43:34 crc kubenswrapper[4759]: I1205 02:43:34.436097 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerName="machine-config-daemon" containerID="cri-o://813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" gracePeriod=600 Dec 05 02:43:34 crc kubenswrapper[4759]: E1205 02:43:34.597545 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:43:35 crc kubenswrapper[4759]: I1205 02:43:35.483138 4759 generic.go:334] "Generic (PLEG): container finished" podID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" exitCode=0 Dec 05 02:43:35 crc kubenswrapper[4759]: I1205 02:43:35.483228 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerDied","Data":"813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e"} Dec 05 02:43:35 crc kubenswrapper[4759]: I1205 02:43:35.483520 4759 scope.go:117] "RemoveContainer" containerID="cb796f1932774ddca456258736516c5764a73173d0beca98b07c9eb74be8a3e4" Dec 05 02:43:35 crc kubenswrapper[4759]: I1205 02:43:35.484458 4759 scope.go:117] "RemoveContainer" 
containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:43:35 crc kubenswrapper[4759]: E1205 02:43:35.484907 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:43:42 crc kubenswrapper[4759]: I1205 02:43:42.895318 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55cfc66bc8-kk6tz_7c292474-9687-4ff3-a1c3-4dffe9594a36/kube-rbac-proxy/0.log" Dec 05 02:43:42 crc kubenswrapper[4759]: I1205 02:43:42.913163 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-55cfc66bc8-kk6tz_7c292474-9687-4ff3-a1c3-4dffe9594a36/manager/0.log" Dec 05 02:43:49 crc kubenswrapper[4759]: I1205 02:43:49.156402 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:43:49 crc kubenswrapper[4759]: E1205 02:43:49.157431 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:44:01 crc kubenswrapper[4759]: I1205 02:44:01.170462 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:44:01 crc kubenswrapper[4759]: E1205 02:44:01.171337 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:44:16 crc kubenswrapper[4759]: I1205 02:44:16.158059 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:44:16 crc kubenswrapper[4759]: E1205 02:44:16.158947 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:44:27 crc kubenswrapper[4759]: I1205 02:44:27.155855 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:44:27 crc kubenswrapper[4759]: E1205 02:44:27.156731 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:44:39 crc kubenswrapper[4759]: I1205 02:44:39.156399 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:44:39 crc kubenswrapper[4759]: E1205 02:44:39.157176 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:44:54 crc kubenswrapper[4759]: I1205 02:44:54.157140 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:44:54 crc kubenswrapper[4759]: E1205 02:44:54.158078 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.256261 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l"] Dec 05 02:45:00 crc kubenswrapper[4759]: E1205 02:45:00.257206 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerName="extract-content" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.257221 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerName="extract-content" Dec 05 02:45:00 crc kubenswrapper[4759]: E1205 02:45:00.257252 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerName="extract-utilities" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.257259 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerName="extract-utilities" Dec 05 02:45:00 crc kubenswrapper[4759]: E1205 02:45:00.257281 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerName="registry-server" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.257286 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerName="registry-server" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.257512 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37a4ca2-3392-4f8d-9413-a0bcd5e1a1b9" containerName="registry-server" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.258348 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.262892 4759 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.270130 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l"] Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.271925 4759 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.334002 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/821396d0-3a0b-4e5d-bc8e-349da4d79a36-config-volume\") pod \"collect-profiles-29415045-jtn8l\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.334204 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8lwr\" (UniqueName: \"kubernetes.io/projected/821396d0-3a0b-4e5d-bc8e-349da4d79a36-kube-api-access-d8lwr\") pod \"collect-profiles-29415045-jtn8l\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.334282 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/821396d0-3a0b-4e5d-bc8e-349da4d79a36-secret-volume\") pod \"collect-profiles-29415045-jtn8l\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.437227 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/821396d0-3a0b-4e5d-bc8e-349da4d79a36-config-volume\") pod \"collect-profiles-29415045-jtn8l\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.437368 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8lwr\" (UniqueName: \"kubernetes.io/projected/821396d0-3a0b-4e5d-bc8e-349da4d79a36-kube-api-access-d8lwr\") pod \"collect-profiles-29415045-jtn8l\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.437427 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/821396d0-3a0b-4e5d-bc8e-349da4d79a36-secret-volume\") pod \"collect-profiles-29415045-jtn8l\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.438601 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/821396d0-3a0b-4e5d-bc8e-349da4d79a36-config-volume\") pod 
\"collect-profiles-29415045-jtn8l\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.448256 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/821396d0-3a0b-4e5d-bc8e-349da4d79a36-secret-volume\") pod \"collect-profiles-29415045-jtn8l\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.464662 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8lwr\" (UniqueName: \"kubernetes.io/projected/821396d0-3a0b-4e5d-bc8e-349da4d79a36-kube-api-access-d8lwr\") pod \"collect-profiles-29415045-jtn8l\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:00 crc kubenswrapper[4759]: I1205 02:45:00.588544 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:01 crc kubenswrapper[4759]: I1205 02:45:01.145070 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l"] Dec 05 02:45:01 crc kubenswrapper[4759]: I1205 02:45:01.572705 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" event={"ID":"821396d0-3a0b-4e5d-bc8e-349da4d79a36","Type":"ContainerStarted","Data":"94ce9073e40e002b52473f4aedd751c387b7637e8027a1d2f9a3d0f813f6672e"} Dec 05 02:45:01 crc kubenswrapper[4759]: I1205 02:45:01.573077 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" event={"ID":"821396d0-3a0b-4e5d-bc8e-349da4d79a36","Type":"ContainerStarted","Data":"03af1d910649a6dd18cee2b259f2a740cf95f0e0c6529a513a4d8d2137b2dd2b"} Dec 05 02:45:01 crc kubenswrapper[4759]: I1205 02:45:01.609688 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" podStartSLOduration=1.609664352 podStartE2EDuration="1.609664352s" podCreationTimestamp="2025-12-05 02:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 02:45:01.599703223 +0000 UTC m=+8520.815364173" watchObservedRunningTime="2025-12-05 02:45:01.609664352 +0000 UTC m=+8520.825325312" Dec 05 02:45:02 crc kubenswrapper[4759]: I1205 02:45:02.592932 4759 generic.go:334] "Generic (PLEG): container finished" podID="821396d0-3a0b-4e5d-bc8e-349da4d79a36" containerID="94ce9073e40e002b52473f4aedd751c387b7637e8027a1d2f9a3d0f813f6672e" exitCode=0 Dec 05 02:45:02 crc kubenswrapper[4759]: I1205 02:45:02.592989 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" event={"ID":"821396d0-3a0b-4e5d-bc8e-349da4d79a36","Type":"ContainerDied","Data":"94ce9073e40e002b52473f4aedd751c387b7637e8027a1d2f9a3d0f813f6672e"} Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.067900 4759 util.go:48] "No ready sandbox for pod can be found. 
Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.125222 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/821396d0-3a0b-4e5d-bc8e-349da4d79a36-config-volume\") pod \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") "
Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.125377 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/821396d0-3a0b-4e5d-bc8e-349da4d79a36-secret-volume\") pod \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") "
Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.125497 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8lwr\" (UniqueName: \"kubernetes.io/projected/821396d0-3a0b-4e5d-bc8e-349da4d79a36-kube-api-access-d8lwr\") pod \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\" (UID: \"821396d0-3a0b-4e5d-bc8e-349da4d79a36\") "
Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.126259 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/821396d0-3a0b-4e5d-bc8e-349da4d79a36-config-volume" (OuterVolumeSpecName: "config-volume") pod "821396d0-3a0b-4e5d-bc8e-349da4d79a36" (UID: "821396d0-3a0b-4e5d-bc8e-349da4d79a36"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.133712 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821396d0-3a0b-4e5d-bc8e-349da4d79a36-kube-api-access-d8lwr" (OuterVolumeSpecName: "kube-api-access-d8lwr") pod "821396d0-3a0b-4e5d-bc8e-349da4d79a36" (UID: "821396d0-3a0b-4e5d-bc8e-349da4d79a36"). InnerVolumeSpecName "kube-api-access-d8lwr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.134716 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821396d0-3a0b-4e5d-bc8e-349da4d79a36-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "821396d0-3a0b-4e5d-bc8e-349da4d79a36" (UID: "821396d0-3a0b-4e5d-bc8e-349da4d79a36"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.230145 4759 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/821396d0-3a0b-4e5d-bc8e-349da4d79a36-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.230208 4759 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/821396d0-3a0b-4e5d-bc8e-349da4d79a36-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.230235 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8lwr\" (UniqueName: \"kubernetes.io/projected/821396d0-3a0b-4e5d-bc8e-349da4d79a36-kube-api-access-d8lwr\") on node \"crc\" DevicePath \"\"" Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.614501 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" event={"ID":"821396d0-3a0b-4e5d-bc8e-349da4d79a36","Type":"ContainerDied","Data":"03af1d910649a6dd18cee2b259f2a740cf95f0e0c6529a513a4d8d2137b2dd2b"} Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.614556 4759 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03af1d910649a6dd18cee2b259f2a740cf95f0e0c6529a513a4d8d2137b2dd2b" Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.614598 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415045-jtn8l" Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.707573 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l"] Dec 05 02:45:04 crc kubenswrapper[4759]: I1205 02:45:04.726144 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415000-zct4l"] Dec 05 02:45:05 crc kubenswrapper[4759]: I1205 02:45:05.178545 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="702f6722-3769-4ac1-b98f-4b29a2001133" path="/var/lib/kubelet/pods/702f6722-3769-4ac1-b98f-4b29a2001133/volumes" Dec 05 02:45:09 crc kubenswrapper[4759]: I1205 02:45:09.156762 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:45:09 crc kubenswrapper[4759]: E1205 02:45:09.157707 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:45:21 crc kubenswrapper[4759]: I1205 02:45:21.173595 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:45:21 crc kubenswrapper[4759]: E1205 02:45:21.174473 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:45:34 crc kubenswrapper[4759]: I1205 02:45:34.156035 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:45:34 crc kubenswrapper[4759]: E1205 02:45:34.157070 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:45:39 crc kubenswrapper[4759]: I1205 02:45:39.920348 4759 scope.go:117] "RemoveContainer" containerID="2ac2b28438c6cb82fd9f970f551981dd6199dc36de811ca0cb4e2489eae61285" Dec 05 02:45:39 crc kubenswrapper[4759]: I1205 02:45:39.953608 4759 scope.go:117] "RemoveContainer" containerID="80cd4fd957cba07397718e0988db6c83370a81e26f18f393f202a78652e86e5d" Dec 05 02:45:42 crc kubenswrapper[4759]: I1205 02:45:42.099500 4759 generic.go:334] "Generic (PLEG): container finished" podID="6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" containerID="925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399" exitCode=0 Dec 05 02:45:42 crc kubenswrapper[4759]: I1205 02:45:42.099588 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k9gnm/must-gather-ccsp9" event={"ID":"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a","Type":"ContainerDied","Data":"925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399"} Dec 05 02:45:42 crc kubenswrapper[4759]: I1205 02:45:42.101135 4759 scope.go:117] "RemoveContainer" containerID="925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399" Dec 05 02:45:42 crc kubenswrapper[4759]: I1205 02:45:42.627742 4759 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-8b7bf4bd7-qq45k" podUID="f625a19c-a9af-401d-a834-37a79e3dfeb4" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 05 02:45:42 crc kubenswrapper[4759]: I1205 02:45:42.862461 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k9gnm_must-gather-ccsp9_6f11f324-34a2-4cd0-ac98-bf89a7b0b33a/gather/0.log" Dec 05 02:45:49 crc kubenswrapper[4759]: I1205 02:45:49.156545 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:45:49 crc kubenswrapper[4759]: E1205 02:45:49.157407 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:45:55 crc kubenswrapper[4759]: I1205 02:45:55.354682 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k9gnm/must-gather-ccsp9"] Dec 05 02:45:55 crc kubenswrapper[4759]: I1205 02:45:55.355490 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-k9gnm/must-gather-ccsp9" podUID="6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" containerName="copy" 
containerID="cri-o://11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948" gracePeriod=2 Dec 05 02:45:55 crc kubenswrapper[4759]: I1205 02:45:55.364781 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k9gnm/must-gather-ccsp9"] Dec 05 02:45:55 crc kubenswrapper[4759]: I1205 02:45:55.855108 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k9gnm_must-gather-ccsp9_6f11f324-34a2-4cd0-ac98-bf89a7b0b33a/copy/0.log" Dec 05 02:45:55 crc kubenswrapper[4759]: I1205 02:45:55.856429 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k9gnm/must-gather-ccsp9" Dec 05 02:45:55 crc kubenswrapper[4759]: I1205 02:45:55.895354 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgxcw\" (UniqueName: \"kubernetes.io/projected/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-kube-api-access-cgxcw\") pod \"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a\" (UID: \"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a\") " Dec 05 02:45:55 crc kubenswrapper[4759]: I1205 02:45:55.895513 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-must-gather-output\") pod \"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a\" (UID: \"6f11f324-34a2-4cd0-ac98-bf89a7b0b33a\") " Dec 05 02:45:55 crc kubenswrapper[4759]: I1205 02:45:55.905581 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-kube-api-access-cgxcw" (OuterVolumeSpecName: "kube-api-access-cgxcw") pod "6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" (UID: "6f11f324-34a2-4cd0-ac98-bf89a7b0b33a"). InnerVolumeSpecName "kube-api-access-cgxcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.006237 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgxcw\" (UniqueName: \"kubernetes.io/projected/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-kube-api-access-cgxcw\") on node \"crc\" DevicePath \"\"" Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.203513 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" (UID: "6f11f324-34a2-4cd0-ac98-bf89a7b0b33a"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.209343 4759 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.343622 4759 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k9gnm_must-gather-ccsp9_6f11f324-34a2-4cd0-ac98-bf89a7b0b33a/copy/0.log" Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.348453 4759 generic.go:334] "Generic (PLEG): container finished" podID="6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" containerID="11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948" exitCode=143 Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.348510 4759 scope.go:117] "RemoveContainer" containerID="11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948" Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.348591 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k9gnm/must-gather-ccsp9" Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.400636 4759 scope.go:117] "RemoveContainer" containerID="925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399" Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.450618 4759 scope.go:117] "RemoveContainer" containerID="11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948" Dec 05 02:45:56 crc kubenswrapper[4759]: E1205 02:45:56.451205 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948\": container with ID starting with 11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948 not found: ID does not exist" containerID="11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948" Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.451288 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948"} err="failed to get container status \"11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948\": rpc error: code = NotFound desc = could not find container \"11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948\": container with ID starting with 11c842a23250e1dd4511e81b27094498707110180bcf33eb09a135c073580948 not found: ID does not exist" Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.451414 4759 scope.go:117] "RemoveContainer" containerID="925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399" Dec 05 02:45:56 crc kubenswrapper[4759]: E1205 02:45:56.452832 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399\": container with ID starting with 925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399 not found: ID does not exist" containerID="925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399" Dec 05 02:45:56 crc kubenswrapper[4759]: I1205 02:45:56.452898 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399"} err="failed to get container status 
\"925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399\": rpc error: code = NotFound desc = could not find container \"925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399\": container with ID starting with 925a77a3709a2615c641a4ef80795f06c3ff50592066c2dbea9c432760f1d399 not found: ID does not exist" Dec 05 02:45:57 crc kubenswrapper[4759]: I1205 02:45:57.170395 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" path="/var/lib/kubelet/pods/6f11f324-34a2-4cd0-ac98-bf89a7b0b33a/volumes" Dec 05 02:46:02 crc kubenswrapper[4759]: I1205 02:46:02.157240 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:46:02 crc kubenswrapper[4759]: E1205 02:46:02.157968 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:46:17 crc kubenswrapper[4759]: I1205 02:46:17.156151 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:46:17 crc kubenswrapper[4759]: E1205 02:46:17.156970 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:46:30 crc kubenswrapper[4759]: I1205 02:46:30.156029 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:46:30 crc kubenswrapper[4759]: E1205 02:46:30.156887 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:46:41 crc kubenswrapper[4759]: I1205 02:46:41.162619 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:46:41 crc kubenswrapper[4759]: E1205 02:46:41.163324 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.597307 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9lfgb"] Dec 05 02:46:54 crc kubenswrapper[4759]: E1205 02:46:54.598467 4759 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="821396d0-3a0b-4e5d-bc8e-349da4d79a36" containerName="collect-profiles" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.598484 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="821396d0-3a0b-4e5d-bc8e-349da4d79a36" containerName="collect-profiles" Dec 05 02:46:54 crc kubenswrapper[4759]: E1205 02:46:54.598503 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" containerName="gather" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.598511 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" containerName="gather" Dec 05 02:46:54 crc kubenswrapper[4759]: E1205 02:46:54.598549 4759 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" containerName="copy" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.598558 4759 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" containerName="copy" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.598844 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" containerName="copy" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.598873 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f11f324-34a2-4cd0-ac98-bf89a7b0b33a" containerName="gather" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.598886 4759 memory_manager.go:354] "RemoveStaleState removing state" podUID="821396d0-3a0b-4e5d-bc8e-349da4d79a36" containerName="collect-profiles" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.607580 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.620670 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lfgb"] Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.710388 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-utilities\") pod \"certified-operators-9lfgb\" (UID: \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.710479 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98plb\" (UniqueName: \"kubernetes.io/projected/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-kube-api-access-98plb\") pod \"certified-operators-9lfgb\" (UID: \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.710558 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-catalog-content\") pod \"certified-operators-9lfgb\" (UID: \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.812911 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-utilities\") pod \"certified-operators-9lfgb\" (UID: 
\"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.813030 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98plb\" (UniqueName: \"kubernetes.io/projected/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-kube-api-access-98plb\") pod \"certified-operators-9lfgb\" (UID: \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.813074 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-catalog-content\") pod \"certified-operators-9lfgb\" (UID: \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.813614 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-utilities\") pod \"certified-operators-9lfgb\" (UID: \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.813748 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-catalog-content\") pod \"certified-operators-9lfgb\" (UID: \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.847892 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98plb\" (UniqueName: \"kubernetes.io/projected/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-kube-api-access-98plb\") pod \"certified-operators-9lfgb\" (UID: \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:46:54 crc kubenswrapper[4759]: I1205 02:46:54.934755 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.416107 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lfgb"] Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.582896 4759 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9p2lc"] Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.585972 4759 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.602071 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9p2lc"] Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.732227 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-utilities\") pod \"community-operators-9p2lc\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.732343 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxjzz\" (UniqueName: \"kubernetes.io/projected/81f06e2c-1996-4186-bac5-b7b74ed0235a-kube-api-access-lxjzz\") pod \"community-operators-9p2lc\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.732413 4759 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-catalog-content\") pod \"community-operators-9p2lc\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.834152 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxjzz\" (UniqueName: \"kubernetes.io/projected/81f06e2c-1996-4186-bac5-b7b74ed0235a-kube-api-access-lxjzz\") pod \"community-operators-9p2lc\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.834274 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-catalog-content\") pod \"community-operators-9p2lc\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.834425 4759 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-utilities\") pod \"community-operators-9p2lc\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.835006 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-utilities\") pod \"community-operators-9p2lc\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.835598 4759 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-catalog-content\") pod \"community-operators-9p2lc\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.871082 4759 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lxjzz\" (UniqueName: \"kubernetes.io/projected/81f06e2c-1996-4186-bac5-b7b74ed0235a-kube-api-access-lxjzz\") pod \"community-operators-9p2lc\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:46:55 crc kubenswrapper[4759]: I1205 02:46:55.906516 4759 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:46:56 crc kubenswrapper[4759]: I1205 02:46:56.115256 4759 generic.go:334] "Generic (PLEG): container finished" podID="67a5a8cd-a48a-4d0b-8528-b0a4461f1af5" containerID="d2467b0831262102b204ad8fdb8052b0cee693c355f33eab98f26aeb4a034b8b" exitCode=0 Dec 05 02:46:56 crc kubenswrapper[4759]: I1205 02:46:56.115367 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lfgb" event={"ID":"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5","Type":"ContainerDied","Data":"d2467b0831262102b204ad8fdb8052b0cee693c355f33eab98f26aeb4a034b8b"} Dec 05 02:46:56 crc kubenswrapper[4759]: I1205 02:46:56.116183 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lfgb" event={"ID":"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5","Type":"ContainerStarted","Data":"5ee6fca9b814877bf7c5d99830ce0cad8621c19f8d8883a6a278fd68131875dc"} Dec 05 02:46:56 crc kubenswrapper[4759]: I1205 02:46:56.120260 4759 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 02:46:56 crc kubenswrapper[4759]: I1205 02:46:56.156159 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:46:56 crc kubenswrapper[4759]: E1205 02:46:56.157133 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:46:56 crc kubenswrapper[4759]: I1205 02:46:56.375002 4759 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9p2lc"] Dec 05 02:46:56 crc kubenswrapper[4759]: W1205 02:46:56.379249 4759 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81f06e2c_1996_4186_bac5_b7b74ed0235a.slice/crio-370b88b3005e78e20f1eb449d752b9e21b06acab481453b74c28429349951385 WatchSource:0}: Error finding container 370b88b3005e78e20f1eb449d752b9e21b06acab481453b74c28429349951385: Status 404 returned error can't find the container with id 370b88b3005e78e20f1eb449d752b9e21b06acab481453b74c28429349951385 Dec 05 02:46:57 crc kubenswrapper[4759]: I1205 02:46:57.129548 4759 generic.go:334] "Generic (PLEG): container finished" podID="81f06e2c-1996-4186-bac5-b7b74ed0235a" containerID="f666a0041c6b8334a67a01a5adecde0077606dec77a482101548bb33035d90ff" exitCode=0 Dec 05 02:46:57 crc kubenswrapper[4759]: I1205 02:46:57.129625 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2lc" event={"ID":"81f06e2c-1996-4186-bac5-b7b74ed0235a","Type":"ContainerDied","Data":"f666a0041c6b8334a67a01a5adecde0077606dec77a482101548bb33035d90ff"} Dec 05 02:46:57 crc 
kubenswrapper[4759]: I1205 02:46:57.130395 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2lc" event={"ID":"81f06e2c-1996-4186-bac5-b7b74ed0235a","Type":"ContainerStarted","Data":"370b88b3005e78e20f1eb449d752b9e21b06acab481453b74c28429349951385"}
Dec 05 02:46:57 crc kubenswrapper[4759]: I1205 02:46:57.134102 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lfgb" event={"ID":"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5","Type":"ContainerStarted","Data":"25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8"}
Dec 05 02:46:58 crc kubenswrapper[4759]: I1205 02:46:58.146286 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2lc" event={"ID":"81f06e2c-1996-4186-bac5-b7b74ed0235a","Type":"ContainerStarted","Data":"8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7"}
Dec 05 02:46:58 crc kubenswrapper[4759]: I1205 02:46:58.147796 4759 generic.go:334] "Generic (PLEG): container finished" podID="67a5a8cd-a48a-4d0b-8528-b0a4461f1af5" containerID="25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8" exitCode=0
Dec 05 02:46:58 crc kubenswrapper[4759]: I1205 02:46:58.147841 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lfgb" event={"ID":"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5","Type":"ContainerDied","Data":"25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8"}
Dec 05 02:46:59 crc kubenswrapper[4759]: I1205 02:46:59.167894 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lfgb" event={"ID":"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5","Type":"ContainerStarted","Data":"69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785"}
Dec 05 02:46:59 crc kubenswrapper[4759]: I1205 02:46:59.193144 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9lfgb" podStartSLOduration=2.678763811 podStartE2EDuration="5.193080331s" podCreationTimestamp="2025-12-05 02:46:54 +0000 UTC" firstStartedPulling="2025-12-05 02:46:56.119356021 +0000 UTC m=+8635.335016971" lastFinishedPulling="2025-12-05 02:46:58.633672541 +0000 UTC m=+8637.849333491" observedRunningTime="2025-12-05 02:46:59.184843061 +0000 UTC m=+8638.400504011" watchObservedRunningTime="2025-12-05 02:46:59.193080331 +0000 UTC m=+8638.408741281"
Dec 05 02:47:00 crc kubenswrapper[4759]: I1205 02:47:00.175026 4759 generic.go:334] "Generic (PLEG): container finished" podID="81f06e2c-1996-4186-bac5-b7b74ed0235a" containerID="8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7" exitCode=0
Dec 05 02:47:00 crc kubenswrapper[4759]: I1205 02:47:00.175115 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2lc" event={"ID":"81f06e2c-1996-4186-bac5-b7b74ed0235a","Type":"ContainerDied","Data":"8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7"}
Dec 05 02:47:01 crc kubenswrapper[4759]: I1205 02:47:01.192027 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2lc" event={"ID":"81f06e2c-1996-4186-bac5-b7b74ed0235a","Type":"ContainerStarted","Data":"d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd"}
Dec 05 02:47:01 crc kubenswrapper[4759]: I1205 02:47:01.222060 4759 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9p2lc" podStartSLOduration=2.776765954 podStartE2EDuration="6.222030507s" podCreationTimestamp="2025-12-05 02:46:55 +0000 UTC" firstStartedPulling="2025-12-05 02:46:57.131630015 +0000 UTC m=+8636.347291005" lastFinishedPulling="2025-12-05 02:47:00.576894608 +0000 UTC m=+8639.792555558" observedRunningTime="2025-12-05 02:47:01.217560859 +0000 UTC m=+8640.433221859" watchObservedRunningTime="2025-12-05 02:47:01.222030507 +0000 UTC m=+8640.437691497"
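
The two "Observed pod startup duration" entries above show how the tracker discounts image pulls: podStartSLOduration is podStartE2EDuration minus the firstStartedPulling to lastFinishedPulling window. Re-deriving the certified-operators-9lfgb figure from the logged values (a sketch; the constants are copied from the entry):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // podStartE2EDuration and the pull window from the log entry above.
        e2e := 5193080331 * time.Nanosecond
        firstPull, _ := time.Parse(time.RFC3339Nano, "2025-12-05T02:46:56.119356021Z")
        lastPull, _ := time.Parse(time.RFC3339Nano, "2025-12-05T02:46:58.633672541Z")
        slo := e2e - lastPull.Sub(firstPull)
        fmt.Println(slo) // 2.678763811s, matching podStartSLOduration
    }

The community-operators-9p2lc entry reconciles the same way, to within rounding of the printed timestamps.
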
pod="openshift-marketplace/community-operators-9p2lc" podStartSLOduration=2.776765954 podStartE2EDuration="6.222030507s" podCreationTimestamp="2025-12-05 02:46:55 +0000 UTC" firstStartedPulling="2025-12-05 02:46:57.131630015 +0000 UTC m=+8636.347291005" lastFinishedPulling="2025-12-05 02:47:00.576894608 +0000 UTC m=+8639.792555558" observedRunningTime="2025-12-05 02:47:01.217560859 +0000 UTC m=+8640.433221859" watchObservedRunningTime="2025-12-05 02:47:01.222030507 +0000 UTC m=+8640.437691497" Dec 05 02:47:04 crc kubenswrapper[4759]: I1205 02:47:04.935459 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:47:04 crc kubenswrapper[4759]: I1205 02:47:04.936098 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:47:04 crc kubenswrapper[4759]: I1205 02:47:04.986419 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:47:05 crc kubenswrapper[4759]: I1205 02:47:05.297172 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:47:05 crc kubenswrapper[4759]: I1205 02:47:05.373181 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lfgb"] Dec 05 02:47:05 crc kubenswrapper[4759]: I1205 02:47:05.906617 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:47:05 crc kubenswrapper[4759]: I1205 02:47:05.906944 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:47:05 crc kubenswrapper[4759]: I1205 02:47:05.975971 4759 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:47:06 crc kubenswrapper[4759]: I1205 02:47:06.315956 4759 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:47:07 crc kubenswrapper[4759]: I1205 02:47:07.155548 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:47:07 crc kubenswrapper[4759]: E1205 02:47:07.155827 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:47:07 crc kubenswrapper[4759]: I1205 02:47:07.262351 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9lfgb" podUID="67a5a8cd-a48a-4d0b-8528-b0a4461f1af5" containerName="registry-server" containerID="cri-o://69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785" gracePeriod=2 Dec 05 02:47:07 crc kubenswrapper[4759]: I1205 02:47:07.773710 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9p2lc"] Dec 05 02:47:07 crc kubenswrapper[4759]: I1205 02:47:07.891588 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.023185 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98plb\" (UniqueName: \"kubernetes.io/projected/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-kube-api-access-98plb\") pod \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\" (UID: \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.023646 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-utilities\") pod \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\" (UID: \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.023948 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-catalog-content\") pod \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\" (UID: \"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5\") " Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.024519 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-utilities" (OuterVolumeSpecName: "utilities") pod "67a5a8cd-a48a-4d0b-8528-b0a4461f1af5" (UID: "67a5a8cd-a48a-4d0b-8528-b0a4461f1af5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.024803 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.031642 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-kube-api-access-98plb" (OuterVolumeSpecName: "kube-api-access-98plb") pod "67a5a8cd-a48a-4d0b-8528-b0a4461f1af5" (UID: "67a5a8cd-a48a-4d0b-8528-b0a4461f1af5"). InnerVolumeSpecName "kube-api-access-98plb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.069938 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67a5a8cd-a48a-4d0b-8528-b0a4461f1af5" (UID: "67a5a8cd-a48a-4d0b-8528-b0a4461f1af5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.126642 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98plb\" (UniqueName: \"kubernetes.io/projected/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-kube-api-access-98plb\") on node \"crc\" DevicePath \"\"" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.126675 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.278172 4759 generic.go:334] "Generic (PLEG): container finished" podID="67a5a8cd-a48a-4d0b-8528-b0a4461f1af5" containerID="69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785" exitCode=0 Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.278249 4759 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lfgb" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.278337 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lfgb" event={"ID":"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5","Type":"ContainerDied","Data":"69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785"} Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.278462 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lfgb" event={"ID":"67a5a8cd-a48a-4d0b-8528-b0a4461f1af5","Type":"ContainerDied","Data":"5ee6fca9b814877bf7c5d99830ce0cad8621c19f8d8883a6a278fd68131875dc"} Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.278496 4759 scope.go:117] "RemoveContainer" containerID="69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.279529 4759 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9p2lc" podUID="81f06e2c-1996-4186-bac5-b7b74ed0235a" containerName="registry-server" containerID="cri-o://d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd" gracePeriod=2 Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.314362 4759 scope.go:117] "RemoveContainer" containerID="25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.360568 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lfgb"] Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.363671 4759 scope.go:117] "RemoveContainer" containerID="d2467b0831262102b204ad8fdb8052b0cee693c355f33eab98f26aeb4a034b8b" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.376880 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9lfgb"] Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.536273 4759 scope.go:117] "RemoveContainer" containerID="69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785" Dec 05 02:47:08 crc kubenswrapper[4759]: E1205 02:47:08.536971 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785\": container with ID starting with 69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785 not found: ID does not exist" 
containerID="69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.537016 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785"} err="failed to get container status \"69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785\": rpc error: code = NotFound desc = could not find container \"69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785\": container with ID starting with 69bdb96c2edeea39bc2acf8fb38c68ceeb8b84e5cd38c570120a1aba96d65785 not found: ID does not exist" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.537056 4759 scope.go:117] "RemoveContainer" containerID="25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8" Dec 05 02:47:08 crc kubenswrapper[4759]: E1205 02:47:08.537402 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8\": container with ID starting with 25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8 not found: ID does not exist" containerID="25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.537439 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8"} err="failed to get container status \"25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8\": rpc error: code = NotFound desc = could not find container \"25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8\": container with ID starting with 25ecdaabf0f94af988a40b31d83cd766d793bc2d638f9a18f1c004647c4c82e8 not found: ID does not exist" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.537460 4759 scope.go:117] "RemoveContainer" containerID="d2467b0831262102b204ad8fdb8052b0cee693c355f33eab98f26aeb4a034b8b" Dec 05 02:47:08 crc kubenswrapper[4759]: E1205 02:47:08.537679 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2467b0831262102b204ad8fdb8052b0cee693c355f33eab98f26aeb4a034b8b\": container with ID starting with d2467b0831262102b204ad8fdb8052b0cee693c355f33eab98f26aeb4a034b8b not found: ID does not exist" containerID="d2467b0831262102b204ad8fdb8052b0cee693c355f33eab98f26aeb4a034b8b" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.537705 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2467b0831262102b204ad8fdb8052b0cee693c355f33eab98f26aeb4a034b8b"} err="failed to get container status \"d2467b0831262102b204ad8fdb8052b0cee693c355f33eab98f26aeb4a034b8b\": rpc error: code = NotFound desc = could not find container \"d2467b0831262102b204ad8fdb8052b0cee693c355f33eab98f26aeb4a034b8b\": container with ID starting with d2467b0831262102b204ad8fdb8052b0cee693c355f33eab98f26aeb4a034b8b not found: ID does not exist" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.827036 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.965355 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxjzz\" (UniqueName: \"kubernetes.io/projected/81f06e2c-1996-4186-bac5-b7b74ed0235a-kube-api-access-lxjzz\") pod \"81f06e2c-1996-4186-bac5-b7b74ed0235a\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.965492 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-catalog-content\") pod \"81f06e2c-1996-4186-bac5-b7b74ed0235a\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.965635 4759 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-utilities\") pod \"81f06e2c-1996-4186-bac5-b7b74ed0235a\" (UID: \"81f06e2c-1996-4186-bac5-b7b74ed0235a\") " Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.966928 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-utilities" (OuterVolumeSpecName: "utilities") pod "81f06e2c-1996-4186-bac5-b7b74ed0235a" (UID: "81f06e2c-1996-4186-bac5-b7b74ed0235a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:47:08 crc kubenswrapper[4759]: I1205 02:47:08.973754 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f06e2c-1996-4186-bac5-b7b74ed0235a-kube-api-access-lxjzz" (OuterVolumeSpecName: "kube-api-access-lxjzz") pod "81f06e2c-1996-4186-bac5-b7b74ed0235a" (UID: "81f06e2c-1996-4186-bac5-b7b74ed0235a"). InnerVolumeSpecName "kube-api-access-lxjzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.027689 4759 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81f06e2c-1996-4186-bac5-b7b74ed0235a" (UID: "81f06e2c-1996-4186-bac5-b7b74ed0235a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.068634 4759 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxjzz\" (UniqueName: \"kubernetes.io/projected/81f06e2c-1996-4186-bac5-b7b74ed0235a-kube-api-access-lxjzz\") on node \"crc\" DevicePath \"\"" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.068667 4759 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.068684 4759 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f06e2c-1996-4186-bac5-b7b74ed0235a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.168018 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67a5a8cd-a48a-4d0b-8528-b0a4461f1af5" path="/var/lib/kubelet/pods/67a5a8cd-a48a-4d0b-8528-b0a4461f1af5/volumes" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.291480 4759 generic.go:334] "Generic (PLEG): container finished" podID="81f06e2c-1996-4186-bac5-b7b74ed0235a" containerID="d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd" exitCode=0 Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.291545 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2lc" event={"ID":"81f06e2c-1996-4186-bac5-b7b74ed0235a","Type":"ContainerDied","Data":"d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd"} Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.291574 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2lc" event={"ID":"81f06e2c-1996-4186-bac5-b7b74ed0235a","Type":"ContainerDied","Data":"370b88b3005e78e20f1eb449d752b9e21b06acab481453b74c28429349951385"} Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.291601 4759 scope.go:117] "RemoveContainer" containerID="d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.291700 4759 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9p2lc" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.332603 4759 scope.go:117] "RemoveContainer" containerID="8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.333717 4759 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9p2lc"] Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.345110 4759 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9p2lc"] Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.374473 4759 scope.go:117] "RemoveContainer" containerID="f666a0041c6b8334a67a01a5adecde0077606dec77a482101548bb33035d90ff" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.393710 4759 scope.go:117] "RemoveContainer" containerID="d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd" Dec 05 02:47:09 crc kubenswrapper[4759]: E1205 02:47:09.394253 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd\": container with ID starting with d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd not found: ID does not exist" containerID="d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.394284 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd"} err="failed to get container status \"d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd\": rpc error: code = NotFound desc = could not find container \"d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd\": container with ID starting with d96d6de0a4a49a2e1cb4513453fa418d920b70cdf360f332a9d7f5d1503c5abd not found: ID does not exist" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.394315 4759 scope.go:117] "RemoveContainer" containerID="8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7" Dec 05 02:47:09 crc kubenswrapper[4759]: E1205 02:47:09.394905 4759 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7\": container with ID starting with 8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7 not found: ID does not exist" containerID="8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.395028 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7"} err="failed to get container status \"8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7\": rpc error: code = NotFound desc = could not find container \"8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7\": container with ID starting with 8651232b6c500ecff6b07181cab8cea8511e01ca3e7199b02c441f2d0bdcf5f7 not found: ID does not exist" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.395129 4759 scope.go:117] "RemoveContainer" containerID="f666a0041c6b8334a67a01a5adecde0077606dec77a482101548bb33035d90ff" Dec 05 02:47:09 crc kubenswrapper[4759]: E1205 02:47:09.395617 4759 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f666a0041c6b8334a67a01a5adecde0077606dec77a482101548bb33035d90ff\": container with ID starting with f666a0041c6b8334a67a01a5adecde0077606dec77a482101548bb33035d90ff not found: ID does not exist" containerID="f666a0041c6b8334a67a01a5adecde0077606dec77a482101548bb33035d90ff" Dec 05 02:47:09 crc kubenswrapper[4759]: I1205 02:47:09.395644 4759 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f666a0041c6b8334a67a01a5adecde0077606dec77a482101548bb33035d90ff"} err="failed to get container status \"f666a0041c6b8334a67a01a5adecde0077606dec77a482101548bb33035d90ff\": rpc error: code = NotFound desc = could not find container \"f666a0041c6b8334a67a01a5adecde0077606dec77a482101548bb33035d90ff\": container with ID starting with f666a0041c6b8334a67a01a5adecde0077606dec77a482101548bb33035d90ff not found: ID does not exist" Dec 05 02:47:11 crc kubenswrapper[4759]: I1205 02:47:11.172893 4759 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f06e2c-1996-4186-bac5-b7b74ed0235a" path="/var/lib/kubelet/pods/81f06e2c-1996-4186-bac5-b7b74ed0235a/volumes" Dec 05 02:47:20 crc kubenswrapper[4759]: I1205 02:47:20.156425 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:47:20 crc kubenswrapper[4759]: E1205 02:47:20.157272 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:47:31 crc kubenswrapper[4759]: I1205 02:47:31.165026 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:47:31 crc kubenswrapper[4759]: E1205 02:47:31.166398 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:47:43 crc kubenswrapper[4759]: I1205 02:47:43.155867 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:47:43 crc kubenswrapper[4759]: E1205 02:47:43.156610 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:47:58 crc kubenswrapper[4759]: I1205 02:47:58.156414 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:47:58 crc kubenswrapper[4759]: E1205 02:47:58.157926 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:48:12 crc kubenswrapper[4759]: I1205 02:48:12.156721 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:48:12 crc kubenswrapper[4759]: E1205 02:48:12.157787 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:48:23 crc kubenswrapper[4759]: I1205 02:48:23.158659 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:48:23 crc kubenswrapper[4759]: E1205 02:48:23.159953 4759 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5q8ns_openshift-machine-config-operator(879c79ed-3fea-4896-84a5-e3c44d13a0c6)\"" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" podUID="879c79ed-3fea-4896-84a5-e3c44d13a0c6" Dec 05 02:48:37 crc kubenswrapper[4759]: I1205 02:48:37.156164 4759 scope.go:117] "RemoveContainer" containerID="813d4806ca04a4ce4fab27df45ad4ab9d890a7b7b3c5752cd63838d077e7262e" Dec 05 02:48:38 crc kubenswrapper[4759]: I1205 02:48:38.453192 4759 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5q8ns" event={"ID":"879c79ed-3fea-4896-84a5-e3c44d13a0c6","Type":"ContainerStarted","Data":"f7c387d37d359407a1053a273b73ccfe8ca696206c15d1803397d218b2c04b8f"}